For example, a transistor sends current to represent two 1’s. How does the hardware know that it’s two 1’s and not one or three? That current exists for a certain amount of time, so how does the hardware time it so that it knows when to separate the constant current into the correct number of bits, and what is that timing? Or is it not timing at all, but a total sum of current it receives before counting one bit, then resetting and waiting for the current to reach that sum again to count another bit? But then, how does it count 0’s if there is no current? Surely that would have to involve some sort of timing? If there are two seconds of no current, how many 0’s is that?
Hopefully I have explained my inquiry sufficiently.
This is the clock speed you always hear about. Plus, processors don’t read one continuous stream of 1s and 0s; they pull whole blocks of bits at the same time. For example, say you want to store a 32-bit value in a CPU register: one instruction does that all at once, in a single clock cycle. The data paths are like a very wide highway, not a one-lane country road. Blocks of bits move in chunks every clock cycle.
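To make the "whole blocks at once" idea concrete, here is a minimal C sketch (the value 0xDEADBEEF is just an arbitrary example pattern). The 32-bit word is moved as a single unit; software never times individual bits, it only masks and shifts them out of a word that was transferred whole.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* A 32-bit value travels as one block per clock cycle; the width
       of the data path fixes how many bits move at a time. */
    uint32_t value = 0xDEADBEEF;   /* arbitrary example 32-bit pattern */

    /* No timing of individual currents is involved in software:
       any single bit is recovered after the fact by shifting and
       masking the word that arrived whole. */
    for (int bit = 31; bit >= 0; bit--) {
        putchar(((value >> bit) & 1u) ? '1' : '0');
    }
    putchar('\n');   /* prints 11011110101011011011111011101111 */

    return 0;
}
```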