How does a program actually know what a second is?


So if a program needs to wait 10 seconds for something, how is it actually calculating that if the processor's speed continuously changes? (And I'm talking about offline programs that do not use NTP.)


20 Answers

Anonymous 0 Comments

There is a system clock attached to every modern computer, and on internet-connected operating systems it is regularly synchronized with atomic-clock-backed time servers on the internet, through a service called Network Time Protocol (NTP). Programs running under an operating system ask the OS for the time using a system call.
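
For example, on a POSIX system (an assumption here, e.g. Linux or macOS), that system call can look like this rough sketch:

    /* Ask the OS for the current wall-clock time via a system call. */
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        struct timespec now;
        /* CLOCK_REALTIME is the wall-clock time the OS maintains
           (and may keep synchronized over NTP when online). */
        clock_gettime(CLOCK_REALTIME, &now);
        printf("Seconds since the Unix epoch: %lld.%09ld\n",
               (long long)now.tv_sec, now.tv_nsec);
        return 0;
    }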

Anonymous 0 Comments

Eli5 version:
Your computer has a clock inside. That clock is far more precise than just seconds (it has to be), but it can easily calculate seconds from its more precise units.

Anonymous 0 Comments

There is a type of rock which, when hit with electricity, begins to vibrate, and it always vibrates at the same speed. This tiny rock (crystal) is part of every computer.

Programmers pre-set how many times this crystal is supposed to vibrate in a second. So programs can just count the vibrations and know when a second has passed.
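
As a rough sketch of the counting idea (not real hardware code; the 32,768 vibrations per second is an assumed figure, a common rate for watch crystals):

    #include <stdint.h>
    #include <stdio.h>

    #define TICKS_PER_SECOND 32768u   /* assumed crystal frequency */

    int main(void) {
        uint32_t ticks = 0, seconds = 0;
        /* Pretend each loop iteration is one crystal vibration. */
        for (uint32_t pulse = 0; pulse < 5 * TICKS_PER_SECOND; pulse++) {
            if (++ticks == TICKS_PER_SECOND) {  /* a full second's worth of pulses */
                ticks = 0;
                seconds++;
            }
        }
        printf("counted %u seconds\n", seconds);  /* prints 5 */
        return 0;
    }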

Anonymous 0 Comments

The same way electric watches do – a crystal oscillator.

When you pass a voltage across the crystal, it vibrates at a very specific frequency, typically millions of times per second depending on the crystal. The computer passes a tiny charge through the crystal and then counts how many times it vibrates; since it knows how many vibrations occur per second, it can tell time.

This is generally handled by a separate component on the motherboard, known as 'the clock'. That way all programs have access to the same time measurement.
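
For instance (a minimal sketch, assuming a POSIX system), any program measuring elapsed time is reading that same OS-maintained clock:

    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    int main(void) {
        struct timespec start, end;
        /* CLOCK_MONOTONIC ticks steadily (typically since boot),
           no matter how busy or fast the CPU happens to be. */
        clock_gettime(CLOCK_MONOTONIC, &start);
        sleep(2);                                /* do some work for ~2 seconds */
        clock_gettime(CLOCK_MONOTONIC, &end);

        double elapsed = (end.tv_sec - start.tv_sec)
                       + (end.tv_nsec - start.tv_nsec) / 1e9;
        printf("elapsed: %.3f seconds\n", elapsed);
        return 0;
    }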

Anonymous 0 Comments

You accidentally stumbled upon one of the hardest issues in computer science. Dealing with time, time zones, time calculations, clock drift, etc, is a nightmare.

Anonymous 0 Comments

It doesn’t. The program can ask the OS, but the OS doesn’t inherently know either. However, the OS has access to timers and clocks on the computer’s motherboard, so it can use those to figure out the time and then cause a delay, return the time, or do whatever the program needs.
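
That is also how the original "wait 10 seconds" case works. A rough sketch, assuming a POSIX system: the program never counts anything itself, it just asks the OS to pause it, and the OS uses its hardware timers to wake the program back up:

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        struct timespec delay = { .tv_sec = 10, .tv_nsec = 0 };
        puts("waiting 10 seconds...");
        nanosleep(&delay, NULL);   /* OS suspends the process until the time is up */
        puts("done");
        return 0;
    }

Because the OS relies on dedicated timer hardware rather than counting CPU instructions, the delay stays 10 seconds even if the processor's speed changes.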

Anonymous 0 Comments

There is a chip inside that has a crystal that pulses a certain number of times per second, usually in the millions or billions. That pulsing drives a counter (and can trigger interrupts). When the computer turns on, the counter starts counting pulses, and the computer knows how many counts equal one second.

In order to keep the time across power cycles, the current time needs to be saved. Most computers (motherboards) have a small battery that keeps a clock circuit running even when the computer is off, so it can keep time.

If there is no saved time, the device can connect to the internet (or another device), ask “what time is it?”, and then keep counting from there.
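
On Linux specifically (a platform assumption), that battery-backed clock chip can even be read directly through the kernel's RTC driver, though normally only the OS does this at boot and programs just ask the OS:

    #include <fcntl.h>
    #include <linux/rtc.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("/dev/rtc0", O_RDONLY);    /* the real-time clock device */
        if (fd < 0) { perror("open /dev/rtc0"); return 1; }

        struct rtc_time tm;
        if (ioctl(fd, RTC_RD_TIME, &tm) == 0) {  /* read the hardware clock */
            printf("RTC says: %04d-%02d-%02d %02d:%02d:%02d\n",
                   tm.tm_year + 1900, tm.tm_mon + 1, tm.tm_mday,
                   tm.tm_hour, tm.tm_min, tm.tm_sec);
        }
        close(fd);
        return 0;
    }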

Anonymous 0 Comments

For the five year old:

Computers have a clock inside them.

For the 10 year old:

Computers have a clock inside them and there is a special function that programs can call to get the current time.

For the 14 year old:

When programming your application you can call a function to get the current or elapsed time. You can also program it so it asks the computer to “call it back” when a certain amount of time has elapsed (see the sketch at the end of this answer).

The computer keeps track of the current time in two ways. First, there is a special chip that keeps track of the time, and it has a small battery which keeps power to it. The computer references this when it’s turned on, which is why it knows the current time even when power has been removed. Second, when connected to a network or the internet, it sends a request for the current time, and the closest time server responds with it.

For the 16 year old:

(Above +) Computers also have a special circuit called a “clock” which operates at a known frequency. The clock “ticks” at the speed at which the computer can process instructions at the lowest level (CPU instructions, not application code). Because the frequency is known, the number of ticks per second is also known. Most CPUs and microprocessors use a quartz crystal, which vibrates at a known frequency, to generate these ticks.
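
The “call it back” part mentioned above can look like this rough sketch on a POSIX system (an assumption): the program registers a handler and asks the OS for an alarm, and the OS’s timer hardware triggers the callback:

    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    static void on_alarm(int sig) {
        (void)sig;
        write(STDOUT_FILENO, "10 seconds are up\n", 18);  /* async-signal-safe output */
    }

    int main(void) {
        signal(SIGALRM, on_alarm);   /* "call this function back" */
        alarm(10);                   /* ask the OS to signal us in 10 seconds */
        pause();                     /* sleep until a signal arrives */
        return 0;
    }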

Anonymous 0 Comments

How did they define a second in terms of crystal oscillations in the first place?

Anonymous 0 Comments

Next question for you to ponder: how do you know a second for you is the same amount of time as a second for me?