For example, a transistor sends current to represent two 1’s. How does the hardware know that it’s two 1’s and not one or three? That current exists for a certain amount of time, so how does the hardware time it so that it knows when to divide the constant current into the correct number of bits, and what is that timing? Or is it not timing at all, but a total sum of current it receives before counting one bit, then resetting and waiting for the current to reach that sum again to count another bit? But then, how does it count 0’s if there is no current? Surely that would have to involve some sort of timing? If there are two seconds of no current, how many 0’s is that?
Hopefully I have explained my inquiry sufficiently.
You’ve probably heard the term *overclocking* before. A couple of pins coming into and going out of the chip carry the *clock signal*, which does exactly what you’re asking about. All it does, all the time, is alternate between 0 and 1 at a fixed rate. That clock sets the timing for every other signal coming in: the chip samples the incoming signal once per clock tick, so that’s how it knows it’s two 1s and not one long 1.
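Here is a toy sketch of that idea in Python (purely illustrative; real chips do this in hardware, sampling on clock edges rather than in a loop). The receiver reads the data line once per clock tick, so a level held high for three ticks is read as three 1s, and two ticks of no current are read as two 0s:

```python
def sample_on_clock(data_line, ticks):
    """Read the data line once per clock tick and collect the bits."""
    bits = []
    for t in range(ticks):
        bits.append(data_line(t))  # one sample per tick
    return bits

# A hypothetical data line: current (1) during ticks 0-2, none (0) during ticks 3-4.
line = lambda t: 1 if t < 3 else 0

print(sample_on_clock(line, 5))  # -> [1, 1, 1, 0, 0]
```

Without the shared clock, the receiver would have no way to decide whether that stretch of current was one bit or three; with it, "two seconds of no current" is simply however many clock ticks fit into two seconds, each counted as a 0.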
Overclocking, as the name suggests, means speeding up that clock so the chip processes data more quickly. That also uses more energy and produces more heat.