This is a follow-up to my earlier question on the lengths of months and weeks.

How did we decide that a second is 1000 ms but a minute is 60 s, and so on?


So back in the day, the idea of a decimal (base-10) system was completely foreign. It was easiest to divide and multiply by smaller numbers. Doubles are easy, and threes aren’t so bad either. Two threes gets you six. Two of those is twelve. Five of those is sixty.

Because of this, numbers with lots of divisors became popular. 24, 60, 120, 360.

Because there are *almost* 360 days in a year, 360 was picked for the number of degrees in a circle. One degree is one day (almost).

24 for hours in a day is arbitrary, but there are also some convenient physical attributes to it. 360/24 is 15, and if you put your arm out straight and stick up your thumb, the base of your hand to the tip of your thumb will span about 15 degrees. It’s a coincidence of human anatomy and proportions, and it’s roughly the same for pretty much everyone. But now you have a gauge for how long an hour is: stick out your thumb, and wait for the sun to move from your thumb to the bottom of your hand. I’m pretty sure, though I have no source, that this is why 24 was chosen instead of 60 like they did for minutes and seconds.

Minutes and seconds were just dividing by 60, then 60 again.

It’s all totally arbitrary though, except for days in a year. You could say a day is ten hours long, an hour is ten minutes, and a minute is ten seconds. The value of all of those would be different from what we now use, but it wouldn’t fundamentally change anything.
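As a quick sanity check of that hypothetical decimal clock, here is the arithmetic the paragraph implies, sketched in Python (the scheme itself is hypothetical; only the numbers are given above):

```python
# A day currently contains 24 * 60 * 60 = 86,400 of our seconds.
current_seconds_per_day = 24 * 60 * 60

# Hypothetical decimal clock: 10 hours/day, 10 minutes/hour, 10 seconds/minute.
decimal_units_per_day = 10 * 10 * 10

# How long one "decimal second" would be, measured in current seconds.
decimal_second_length = current_seconds_per_day / decimal_units_per_day
print(decimal_second_length)  # 86.4
```

The day itself stays the same length; only the size of the subdivisions changes, which is the point being made.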

We did not decide that a second is 1000 milliseconds. The prefix “milli” always indicates one thousandth of the unit it precedes. So it’s more that we had the second as a unit, and as we needed finer scales we picked the prefixes that seemed useful in that context: milli, micro, nano, pico, femto, and so on.
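In code terms, the SI prefixes are just powers of ten applied to the base unit; a minimal sketch (the table and helper function here are for illustration, not a real library):

```python
# SI prefixes are pure powers of ten applied to the base unit (here, the second).
SI_PREFIXES = {
    "milli": 1e-3,   # thousandth
    "micro": 1e-6,   # millionth
    "nano":  1e-9,
    "pico":  1e-12,
    "femto": 1e-15,
}

def to_seconds(value, prefix):
    """Convert a prefixed quantity (e.g. 1000 milliseconds) into base seconds."""
    return value * SI_PREFIXES[prefix]

print(to_seconds(1000, "milli"))  # 1.0 -- 1000 ms is one second by definition
```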

Regarding the 60 seconds in a minute, that has historical reasons: the ancient time measurements that stuck around were built that way.


Before people were comfortable with fractions, they wanted to use numbers that were easily divided. In general these numbers are:

* 12: divisible by 2, 3, 4, and 6.

* 60: divisible by 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.

* 360: divisible by 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, 18, 20, 24, 30, 36, 40, 45, 60, 72, 90, 120, and 180.
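Those divisor lists are easy to check mechanically; a small Python sketch:

```python
def divisors(n):
    """Divisors of n strictly between 1 and n -- the useful 'even split' sizes."""
    return [d for d in range(2, n) if n % d == 0]

print(divisors(12))   # [2, 3, 4, 6]
print(divisors(60))   # [2, 3, 4, 5, 6, 10, 12, 15, 20, 30]
print(divisors(360))  # the 22 divisors listed above
```

Among numbers of similar size, 12, 60, and 360 are unusually rich in divisors, which is exactly what made them attractive before fractions were in common use.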

Days, months, and years are objective features of reality and not something we can control (although, conveniently, there *are* approximately 360 days in a year).

The Ancient Mesopotamians divided each day into 12 hours. (60 was considered, but there are very few tasks that take less than a twelfth of a day, and 60 is a large enough number that people would be in danger of losing count of what hour it was.)

Once we got mechanical clocks (as opposed to sundials), the night ended up divided into 12 hours too, for a total of 24 hours between one dawn and the next.

Much later, as science started happening, the Medieval Mesopotamians decided that more precise units of time should be based on the number 60. An hour could be divided into 60 primary-minutes, a primary-minute could be divided into 60 secondary-minutes, a secondary-minute could be divided into 60 tertiary-minutes, and so on.
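The repeated division by 60 described above can be sketched as simple arithmetic:

```python
hour = 3600.0        # one hour, measured in our modern seconds

minute = hour / 60   # "primary minute"   -> 60.0 s
second = minute / 60 # "secondary minute" -> 1.0 s (our second)
third = second / 60  # "tertiary minute", or "third" -> ~0.0167 s

print(minute, second, third)
```

Each level is 1/60 of the one before it; the first two levels survive as our minute and second, while the third was too short to be practical.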

The minutes and seconds stuck (conveniently, a second is approximately the same length as the interval between human heartbeats), but the thirds weren’t very useful: they were so brief that they required complicated instruments to measure, and they never saw widespread use. So in modern times, when we decided to standardize everything to a single metric unit, we chose the second as the base unit of time. Things like milliseconds and microseconds aren’t really separate units; “milli” just means “thousandth” and “micro” just means “millionth”.

Earth time is based on our planet’s movement through the solar system. We live in a giant clock.