How does a program actually know what a second is?

So if a program needs to wait 10 seconds for something, how does it actually measure that if the processor's speed continuously changes? (And I'm talking about offline programs that don't use NTP.)

20 Answers

Anonymous 0 Comments

The code that keeps track of time also knows the CPU's current speed. At a constant speed of, say, 1 GHz, a second corresponds to one billion cycles. When the CPU is throttled to, say, 500 MHz, the timekeeping code knows that only half a billion cycles now make up a second. The speed may change quickly and often, but as long as each stretch of cycles is counted at the speed it ran at, everything stays in sync.
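The arithmetic in this answer can be sketched in a few lines. This is an illustration of the idea only, not a real OS API: the function name and the sample numbers are made up, and real systems typically read a hardware counter rather than literally counting cycles in software.

```python
def cycles_to_seconds(cycles: int, freq_hz: int) -> float:
    """Convert a raw cycle count to elapsed seconds at a given clock speed."""
    return cycles / freq_hz

# At 1 GHz, a billion cycles take one second.
print(cycles_to_seconds(1_000_000_000, 1_000_000_000))  # → 1.0

# Throttled to 500 MHz, half a billion cycles still equal one second,
# because the conversion uses the current frequency.
print(cycles_to_seconds(500_000_000, 500_000_000))  # → 1.0
```

The key point is that a cycle count alone is meaningless; it only becomes a duration once you divide by the frequency those cycles ran at.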
