eli5: At the most basic level, how is a computer programmed to know how long a unit of time is?

>how is a computer programmed to know how long a unit of time is?

It’s almost the exact opposite: no “programming” is needed for the basics. Just like people have a [Circadian Rhythm](https://en.wikipedia.org/wiki/Circadian_rhythm), computers have internal clocks. Every instruction the computer executes takes a [specific number of clock cycles](http://6502.org/users/obelisk/6502/reference.html#LDA).
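As a rough sketch of that idea: if you know the clock speed and how many cycles each loop iteration costs, you can compute how many iterations burn a given amount of time. The numbers below are illustrative assumptions (a 1 MHz clock and a 5-cycle loop body, in the spirit of the linked 6502 reference), not values for any specific chip:

```python
# Hypothetical cycle-counted delay, old-school style: each instruction
# takes a known number of clock cycles, so a counted loop burns a
# predictable amount of real time.
CPU_HZ = 1_000_000          # assumed 1 MHz clock, like a classic 6502
CYCLES_PER_ITERATION = 5    # e.g. a decrement (2 cycles) + taken branch (3 cycles)

def iterations_for_delay(seconds: float) -> int:
    """How many loop iterations burn roughly `seconds` of CPU time."""
    total_cycles = int(seconds * CPU_HZ)
    return total_cycles // CYCLES_PER_ITERATION
```

So a 1 ms delay on this imaginary CPU would need `iterations_for_delay(0.001)` = 200 loop iterations. This is also why such delays break when the same code runs on a faster chip: the cycle count stays the same, but each cycle is shorter.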

Very rarely, you can use this fact directly in code to make small delays. But most of the time it’s far too hard to count the instructions (and all the various paths the code can take), so nearly every CPU comes with a register that [counts clock pulses](http://bauhausinteraction.org/opl/arduino-techniques-getting-rid-of-delays/) for you. You can read that register to figure out when to do something, or set it to interrupt you every so often (usually every few milliseconds). This interrupt is often called a [tick](https://superuser.com/questions/101183/what-is-a-cpu-tick). Instead of “you run a program and it takes up the whole computer until it’s done”, ticks let the computer *look like* it’s running many programs: each one runs for a small slice of time, then the others get their turns, so they all *appear* to run at once. (Today’s computers still do this, but they also have many CPU cores, so often many programs really are running at once.)
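The “read a free-running counter, act on each tick” pattern can be sketched in Python, using `time.monotonic_ns()` as a stand-in for the hardware counter register. The round-robin slicing below is a toy model of the time-sharing described above, with made-up program names and a 1 ms tick chosen just for illustration:

```python
import time

def wait_for_next_tick(tick_ns: int, last_tick: int) -> int:
    """Spin until the free-running counter passes the next tick boundary.
    time.monotonic_ns() stands in for the CPU's clock-pulse counter."""
    deadline = last_tick + tick_ns
    while time.monotonic_ns() < deadline:
        pass  # a real OS would run useful work or halt until the interrupt
    return deadline

def run_slices(programs: list, slices: int, tick_ns: int = 1_000_000) -> list:
    """Give each 'program' one tick of CPU time in round-robin order."""
    order = []
    last = time.monotonic_ns()
    for i in range(slices):
        prog = programs[i % len(programs)]
        order.append(prog)               # pretend the program runs here
        last = wait_for_next_tick(tick_ns, last)
    return order
```

Calling `run_slices(["browser", "editor"], 4)` interleaves the two “programs” tick by tick, which is exactly why they appear to run simultaneously even on a single core.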