How does a program actually know what a second is?

So if a program needs to wait 10 seconds for something, how is it actually measuring that if the processor's speed continuously changes? (And I'm talking about offline programs that don't use NTP.)

Most programs either ask the OS “What time is it?” or tell the OS “I’m done working for now, you can put me to sleep, but please wake me up in 10 seconds.”
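On a POSIX system, those two approaches look roughly like this (a minimal sketch; the 10-second wait and the choice of `CLOCK_MONOTONIC` are just illustrations):

```c
#include <stdio.h>
#include <time.h>
#include <unistd.h>

int main(void) {
    /* Ask the OS "what time is it?" using a clock that counts
       real elapsed time, independent of the CPU's current speed. */
    struct timespec start, end;
    clock_gettime(CLOCK_MONOTONIC, &start);

    /* Tell the OS "put me to sleep, wake me up in 10 seconds."
       The OS's timer hardware does the counting, not this program. */
    sleep(10);

    clock_gettime(CLOCK_MONOTONIC, &end);
    printf("Slept for about %ld seconds\n",
           (long)(end.tv_sec - start.tv_sec));
    return 0;
}
```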

But how does the OS itself know? Or what if you’re writing a program that runs directly (“bare metal”), without any OS?

There’s actually a separate circuit that counts clock pulses. In some “simple” systems, those clock pulses might come from the same clock used to run the CPU. This design makes sense if the CPU always runs at the same speed, or if the speed changes “predictably” (i.e. the program is the only part of the system capable of commanding a speed change).
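On a "simple" bare-metal system like that, waiting is often just reading the counter until enough pulses have gone by. Here's a sketch; the register name `TIMER_COUNT`, its address, and the 1 MHz tick rate are made-up assumptions for illustration:

```c
#include <stdint.h>

/* Hypothetical memory-mapped free-running counter that increments
   once per clock pulse; assumed here to tick at 1 MHz. */
#define TIMER_COUNT (*(volatile uint32_t *)0x40001000u)
#define TICKS_PER_SECOND 1000000u

/* Busy-wait until the counter has advanced by `seconds` worth of
   pulses. Unsigned subtraction handles counter wraparound. */
static void delay_seconds(uint32_t seconds) {
    uint32_t start = TIMER_COUNT;
    while ((uint32_t)(TIMER_COUNT - start) < seconds * TICKS_PER_SECOND) {
        /* spin: the counter, not the CPU's speed, defines time here */
    }
}
```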

In systems where the CPU speed can change "unpredictably," the system usually includes a separate clock that doesn't change. In older PCs (approximately 1980-2005), this was the [Intel 8253](https://en.wikipedia.org/wiki/Intel_8253), but it has since been replaced by the more modern [APIC](https://en.wikipedia.org/wiki/Advanced_Programmable_Interrupt_Controller) timer and [HPET](https://en.wikipedia.org/wiki/High_Precision_Event_Timer).
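As a concrete example, the 8253/8254 in a PC is fed a fixed 1,193,182 Hz clock regardless of CPU speed, and the OS programs it by writing a divisor to I/O ports. A bare-metal sketch of the classic sequence (the `outb` helper shown is the usual x86 inline-assembly idiom):

```c
#include <stdint.h>

#define PIT_FREQUENCY 1193182u  /* fixed input clock in Hz, independent of CPU speed */
#define PIT_CHANNEL0  0x40      /* channel 0 data port */
#define PIT_COMMAND   0x43      /* mode/command port */

/* Write one byte to an x86 I/O port. */
static inline void outb(uint16_t port, uint8_t value) {
    __asm__ volatile ("outb %0, %1" : : "a"(value), "Nd"(port));
}

/* Program channel 0 to fire a periodic interrupt roughly `hz` times
   per second; the OS then counts those interrupts to keep time. */
static void pit_set_frequency(uint32_t hz) {
    uint16_t divisor = (uint16_t)(PIT_FREQUENCY / hz);
    outb(PIT_COMMAND, 0x36);                   /* channel 0, lo/hi byte, mode 3 (square wave) */
    outb(PIT_CHANNEL0, divisor & 0xFF);        /* low byte of divisor */
    outb(PIT_CHANNEL0, (divisor >> 8) & 0xFF); /* high byte of divisor */
}
```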

The actual source of clock pulses for high-speed applications (like a CPU clock) is almost always a [crystal oscillator](https://en.wikipedia.org/wiki/Crystal_oscillator). Simple circuits that don’t need high-speed timing can measure the charging and discharging of a capacitor instead, often with the famous [555 chip](https://en.wikipedia.org/wiki/555_timer_IC).
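For the 555 in astable mode, the timing comes straight from the resistor and capacitor values via the well-known approximation f ≈ 1.44 / ((R1 + 2·R2) · C). A quick sanity check (the component values are arbitrary examples):

```c
#include <stdio.h>

int main(void) {
    /* 555 astable mode: the output frequency depends only on two
       resistors and a capacitor, not on any CPU clock.
       f ~= 1.44 / ((R1 + 2*R2) * C) */
    double r1 = 10000.0;   /* 10 kohm */
    double r2 = 100000.0;  /* 100 kohm */
    double c  = 0.00001;   /* 10 uF */
    double freq = 1.44 / ((r1 + 2.0 * r2) * c);
    printf("approx. %.2f Hz\n", freq);  /* about 0.69 Hz, a slow tick */
    return 0;
}
```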
