Because the minutes/hours system is much older and based on the Sumerian number system.
They were obsessed with the number 12, and 60, being a multiple of 12, 5, and 10, seemed useful to them. So they used it as the base for their number system.
They also split their currency that way: 1 talent = 60 minas, 1 mina = 60 shekels.
The civilizations that first divided the day into smaller units of time (at least in the West) used base 12 (Egyptians) and base 60 (Babylonians/Sumerians) number systems instead of a base 10 number system, and their systems were adapted by the Greeks.
https://www.scientificamerican.com/article/experts-time-division-days-hours-minutes/#
The system is ancient but has stuck around because 12 and 60 are *superior highly composite numbers*. Basically, an unusually large number of integers divide evenly into them, which makes them super convenient. Hours come in 12s, by the way (or 24, depending on the clock convention), not 10.
1/2, 1/3, 1/4, 1/5, 1/6, 1/10, 1/12, 1/15, 1/20, and 1/30 all divide out nicely from 60. Compare that to 10, which only gives you 1/2 and 1/5.
https://en.m.wikipedia.org/wiki/Highly_composite_number#Examples
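A quick way to see the divisor advantage described above (a sketch; the `divisors` helper is mine, just for illustration):

```python
def divisors(n):
    """Return every positive integer that divides n evenly."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60] -- twelve divisors
print(divisors(12))  # [1, 2, 3, 4, 6, 12]
print(divisors(10))  # [1, 2, 5, 10] -- only four
```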
Milliseconds are in decimal because base 60 doesn’t actually make any sense to use at that scale: we would need 60 unique digit symbols to write numbers of arbitrary precision in it. Milliseconds, and frankly our entire numeral system, COULD have been in base 12, though, and there’s a legitimate push to make that happen. There’s also base 6, which is again a *superior highly composite number*, but that’s mostly a meme movement because it’s called “seximal.”
Our time system originated in Sumer and spread from there to basically every other civilization. Sumer used a base-60 (sexagesimal) number system, so 60 was not an oddball number to them at all. Their number system never caught on, but their time system did.
Because we came up with the various units as we became able to measure them.
Days are old. Everyone can notice a sunrise or sunset.
Dividing days into a handful of equal parts is also relatively easy, requiring only the most primitive of timekeeping tools, so “hours” in some form or another have been a thing since ancient times.
Minutes and seconds were only something that anybody started actually using once we had clocks that were accurate enough.
Units smaller than a second are fairly recent, requiring extremely accurate clocks, and by the time we could make those reliably we had already gone metric.
Older units, from before the metric system and decimalization, were often based on 12 or multiples of 12.
The advantage of that is that you can divide 12 by 1, 2, 3, 4, and 6 and get a whole number.
If a big unit is made up of 12 small units, you can have halves, thirds, quarters, and sixths and still end up with whole units.
You can’t split a modern dollar into thirds among three people, but plenty of pre-decimal currency units could do that easily.
This is an obvious advantage in many situations.
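The currency point is easy to verify with modular arithmetic (a sketch using the 240-pence pre-decimal pound and a 100-cent decimal unit):

```python
# Splitting one unit of currency among three people
print(240 % 3)   # 0 -> 240 old pence split into three whole shares
print(240 // 3)  # 80 pence each
print(100 % 3)   # 1 -> 100 cents leave a remainder; no three equal whole shares
```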
More modern units of measurement were designed on the idea that powers of 10 make many types of calculations easier and that fractions are okay.
Minutes as 1/60 of an hour make a lot of sense. You can have half an hour, a third of an hour, a quarter of an hour, a fifth of an hour, a sixth of an hour, and so on.
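Those fractions of an hour all come out as whole minutes, which a short sketch can confirm:

```python
hour = 60  # minutes per hour
fractions = {parts: hour // parts for parts in (2, 3, 4, 5, 6)}
for parts, minutes in fractions.items():
    print(f"1/{parts} of an hour = {minutes} minutes")
# Every entry is a whole number: 30, 20, 15, 12, and 10 minutes respectively
```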
Minutes were originally something along the lines of “first minute part of an hour” and seconds were the “second minute part of an hour”.
You have the same minute-and-second division in angles, as minutes and seconds of arc. 30 arcminutes are half a degree.
“Thirds” as a unit that is one 60th of a second was considered by people in the past, but by the time anyone had any device actually capable of measuring something that small accurately, the trend had gone towards decimalization.
During the time of the French Revolution, people tried to introduce decimal time as a replacement for the old hours, minutes, and seconds, but unlike all the other units, time kept its non-decimal nature for units larger than a single second.
The second is now an integral part of the metric system, so we are stuck with it unless we want to change all the units we have. And all the other natural units of time, like the day or the year, are not going to fit into a power of ten of a second, or of each other. So there is no real chance of getting a fully decimal time until we reach the stars, leave our earth-based time units behind, and adopt stardates or something.
You may feel free to use ‘thirds’ and ‘fourths’ instead of milliseconds, but you may have to explain to people what those are.
Do you mean a dozen for hours? As others have said, it has to do with the number base that was used at the time. Why not 60 hours to a day, and then break that into 60 minutes and 60 seconds? I don’t know.
As for decimal for milliseconds, that’s an interesting question. My first thought is when it goes to a sub unit then decimals makes sense. But the pound currency used to have 240 pence to a pound. So I’m thinking that we didn’t have the precision back then to measure anything less than a second, and no need to anyway. By the time we had the ability and need to, then decimalization was the way it was done.
When have you used decimal hours? Other than for billing, I’ve never really heard of anyone using decimal hours.
Do you mean to ask why hours just keep counting up? They don’t, they end at 24. 24 hours in a day. 28 to 31 days in a month and so forth. Also months can be turned into a standardized 30 days for billing/prorating.
The convention of 60 comes from there being roughly 360 days in a year, so the ancients used multiples/divisors of that. That’s also why there are 360 degrees in a circle. Now that I think about it, using 60s means that both a minute and a second are perfect integer divisors of a whole standardized 360-day year. Of course, the ancients didn’t realize they were off by 5 and a quarter days on their year timing.
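That last thought checks out arithmetically for a standardized 360-day year (a sketch; the 360-day year is the idealization from the comment, not the real calendar):

```python
year_seconds = 360 * 24 * 60 * 60  # idealized 360-day year, in seconds
print(year_seconds)         # 31104000
print(year_seconds % 60)    # 0 -> the 60-second minute divides it evenly
print(year_seconds % 3600)  # 0 -> the 3600-second hour divides it evenly
```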