So if a program needs to wait 10 seconds for something, how does it actually measure that if the processor's speed continuously changes? (And I'm talking about offline programs that don't use NTP.)
The computer has a clock built into it: a crystal that vibrates at a known, fixed frequency, independent of how fast the processor happens to be running. A chip counts those vibrations and knows that every time the crystal has vibrated a certain number of times, a second has passed.
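In practice a program doesn't count the crystal's vibrations itself; it asks the operating system for a clock that is ultimately driven by that hardware counter. A minimal Python sketch of waiting based on elapsed real time rather than CPU speed (the function name is just illustrative):

```python
import time

def wait_seconds(duration):
    # time.monotonic() returns seconds elapsed on a steadily ticking
    # clock backed by the hardware timer -- it doesn't depend on how
    # fast the CPU is currently running, and it never jumps backwards.
    start = time.monotonic()
    while time.monotonic() - start < duration:
        # Sleep briefly so we don't spin the CPU while waiting;
        # the OS wakes us up using the same hardware timer.
        time.sleep(0.01)

wait_seconds(0.2)  # waits roughly 0.2 seconds of real time
```

This is why a 10-second delay stays 10 seconds even if the processor throttles up or down: the program is comparing timestamps from the timer hardware, not counting its own instructions.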