How does a program actually know what a second is?


So if a program needs to wait 10 seconds for something, how does it actually measure that if the processor's speed continuously changes? (And I'm talking about offline programs that don't use NTP.)


20 Answers

Anonymous 0 Comments

Computers have a clock that runs separately from everything else. If a program ever needs to know what time it is, it can ask that clock system.

The clock system ticks at a constant rate. Even if the rest of the computer changes its processing speed, the clock is unaffected.
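For example, here's a rough Python sketch (the two-second wait and the variable names are just made up for illustration) of a program asking that clock instead of trying to count CPU cycles:

```python
import time

# Ask the clock system what time it is, wait, then ask again.
# time.monotonic() reads a clock that ticks at a steady rate,
# independent of how busy or how fast the CPU happens to be.
start = time.monotonic()
time.sleep(2)  # hand the 2-second wait off to the operating system
elapsed = time.monotonic() - start
print(f"About {elapsed:.3f} seconds passed")
```

No matter how fast or slow the CPU runs during those two seconds, the two clock readings come out roughly two seconds apart.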

Computers also have an interrupt system. An interrupt can be scheduled to stop the process the computer is working on and have it run another process before continuing.

An interrupt can be triggered by something like pressing or releasing a keyboard key, or by a scheduled event that happens at a particular time.
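As a loose user-space analogy (real timer interrupts happen in hardware and the operating system, not in Python), this sketch schedules a callback to run about ten seconds from now while the rest of the program keeps working; the `on_timer` name is made up for this example:

```python
import threading
import time

# A user-space stand-in for a scheduled timer "interrupt":
# the OS timer wakes this callback roughly 10 seconds from now,
# while the main program keeps doing other work in the meantime.
def on_timer():
    print("10 seconds are up")

threading.Timer(10.0, on_timer).start()

for i in range(3):
    print("still doing other work...")
    time.sleep(1)
```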

It’s still possible for timing to get *a little* off. The clock system isn’t 1000% perfect. It can be a bit too slow or fast.

The interrupt system can also have its own problems. Most won't be able to schedule two events at the exact same time and might have to queue them up instead. (I remember in grade school 15+ years ago, you could sometimes type an entire word on the computer before the first letter appeared on screen.)

But these days we're probably talking about errors of only tiny fractions of a second. Good enough for nearly any use case.
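If you're curious how tiny, here's a small Python sketch that asks for a bunch of 10-millisecond sleeps and checks how far off the worst one was; the exact numbers will depend on your OS and hardware:

```python
import time

# Request a 10-millisecond sleep many times and record the worst
# overshoot. The result depends on the OS scheduler and hardware.
requested = 0.010
worst = 0.0
for _ in range(100):
    start = time.perf_counter()
    time.sleep(requested)
    worst = max(worst, (time.perf_counter() - start) - requested)

print(f"Worst overshoot over 100 sleeps: {worst * 1000:.3f} ms")
```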
