This is something I've been wondering about out of curiosity. I apologize if it doesn't fall under the scope of this forum; if so, feel free to redirect me :)
Whenever a computer does anything, it handles a sequence of bits. My question is: how does it actually determine which bit goes in which place? I'm curious about the electric current that gets sent into the transistors, and how the resulting bits end up in the correct positions in the binary stream. I understand the idea of transistors and how a threshold voltage determines whether a bit is a 0 or a 1, but how are these bits actually placed in sequence?
Say we have the binary sequence 00010010001. How does the component handling this stream know how "long" it has to wait with no current before it registers a 0-bit and moves on? As I understand it, for the first three 0-bits there will be no current going into the transistors. But does it have to "wait" for a set amount of time before it decides that a single bit is a 0? I'm generally just confused about how a computer tells the difference between a single 0-bit and, say, ten 0-bits in a row. When does the computer know to move on to the next bit? Appreciate any feedback! :)
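To make my mental model concrete, here's a tiny C sketch of how I *imagine* a receiver might work: it samples the wire once per fixed bit period and compares each sample to the threshold voltage. The function read_line_voltage() and the timing constants are things I made up purely for illustration, not anything from real hardware. Is something like this roughly what actually happens?

```c
/* Toy sketch of my mental model, not real hardware code: the receiver samples
 * the wire once per fixed "bit period", so ten zero-bits are just ten
 * consecutive samples below the threshold, not one long undifferentiated pause. */
#include <stdio.h>

#define BIT_PERIOD_US 104   /* e.g. ~9600 bits per second on a serial line (made up) */
#define THRESHOLD_V   1.5   /* voltages above this count as a 1 (made up) */

/* Hypothetical stand-in for "measure the voltage on the wire right now".
 * Here it just pretends the sender is transmitting 00010010001, one bit per period. */
double read_line_voltage(int sample_index) {
    const char *stream = "00010010001";
    return stream[sample_index] == '1' ? 3.3 : 0.0;
}

int main(void) {
    /* The receiver doesn't wait for current to "start"; it samples the line
     * once every BIT_PERIOD_US microseconds and records a 0 or a 1 each time. */
    for (int i = 0; i < 11; i++) {
        double v = read_line_voltage(i);      /* sample at this clock tick   */
        int bit = (v > THRESHOLD_V) ? 1 : 0;  /* compare to the threshold    */
        printf("%d", bit);
        /* in real hardware, a clock edge (or a delay of BIT_PERIOD_US)
         * would set the pace between samples */
    }
    printf("\n");
    return 0;
}
```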