How does a program actually know what a second is?


So if a program needs to wait 10 seconds for something, how does it actually calculate that if the processor's speed continuously changes? (And I'm talking about offline programs that do not use NTP.)


20 Answers

Anonymous

In addition to the RTC (real-time clock) mentioned by a few people, it is important to realize that a CPU does not have a *single* speed.

It is very common for there to be one or two medium-frequency “master” clocks, which are then multiplied and divided to derive a clock for the processor cores, a different one for the memory interface, yet another for the PCIe bus, one for USB, and one dedicated to timekeeping. A modern CPU has *dozens* of clocks, all arranged in a so-called “clock tree”.
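For example, on Linux a program can read that dedicated timekeeping clock through the standard POSIX `clock_gettime` call. Which hardware counter actually backs it (TSC, HPET, etc.) is up to the kernel and varies by platform, but from the program's point of view it advances at a fixed rate. A minimal sketch in C:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec ts;

    /* CLOCK_MONOTONIC advances at a fixed rate derived from the
       timekeeping clock, regardless of how fast the CPU cores are
       currently running. */
    clock_gettime(CLOCK_MONOTONIC, &ts);

    printf("monotonic time: %lld.%09ld s\n",
           (long long)ts.tv_sec, ts.tv_nsec);
    return 0;
}
```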

When a processor’s speed changes, *only* the processor clock’s multiplier changes, while all the other clocks, including the timekeeping one, keep running as-is.
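That separation is exactly what makes the “wait 10 seconds” from the question work: the program never counts CPU cycles. It asks the kernel to wake it after the timekeeping clock has advanced by 10 seconds, so any frequency changes in between have no effect. A sketch, again assuming standard POSIX calls:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec start, end;
    clock_gettime(CLOCK_MONOTONIC, &start);

    /* The kernel satisfies this sleep against the timekeeping clock,
       so it lasts ~10 wall-clock seconds even if the CPU multiplier
       changes (or the core idles entirely) in the meantime. */
    struct timespec delay = { .tv_sec = 10, .tv_nsec = 0 };
    nanosleep(&delay, NULL);

    clock_gettime(CLOCK_MONOTONIC, &end);
    printf("elapsed: ~%lld s\n", (long long)(end.tv_sec - start.tv_sec));
    return 0;
}
```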
