There are two reasons, both separate and both very real, and they can work together or against each other.
The first is, as many have said, the power cycle. US mains power is supposed to run at 60 Hz, and many plug-in clocks keep time by counting those cycles. Elsewhere it's 50 Hz, like in Europe, where a dispute in the Balkans in 2018 caused clocks across the continent to [lose about 6 minutes](https://www.theguardian.com/world/2018/mar/08/european-clocks-lose-six-minutes-dispute-power-electricity-grid) in total because the grid ran slightly below 50 Hz for weeks.
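To make that concrete, here's a minimal sketch of how a cycle-counting clock drifts when the grid runs slow. The frequency figures are illustrative assumptions, not the actual 2018 grid data:

```python
# A minimal sketch (not any real clock's firmware) of how a mains-synchronous
# clock keeps time: it counts AC cycles and assumes each one lasts exactly 1/50 s.

NOMINAL_HZ = 50.0    # what the clock assumes (60.0 in the US)
ACTUAL_HZ = 49.996   # grid running a hair slow -- illustrative, not the real 2018 numbers

def clock_error_seconds(days: float) -> float:
    """Seconds a cycle-counting clock falls behind after `days` of real time."""
    real_seconds = days * 24 * 3600
    cycles_delivered = real_seconds * ACTUAL_HZ    # cycles the grid actually produced
    clock_seconds = cycles_delivered / NOMINAL_HZ  # what the clock thinks elapsed
    return real_seconds - clock_seconds

# Roughly two months at ~0.004 Hz under nominal loses several minutes:
print(f"{clock_error_seconds(60) / 60:.1f} minutes behind")  # ~6.9 minutes
```

A deficit that tiny is invisible on any single day, which is why it took weeks of accumulated missing cycles before anyone's oven clock was noticeably wrong.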
The second is even simpler: most clocks are REALLY CHEAP and really s**t. The clock in your PC? It's less accurate than a $5 wall clock powered by a single AA. If they lose a minute over 5 months, who cares? In a lot of the world you're going to be resetting it anyway for daylight saving/summer time twice a year, and you really won't notice if it's had 90 seconds of drift over 6 months (which works out to half a second a day). And that assumes your power is perfect and you don't have a blip that resets it anyway. So why would manufacturers spend money making sure it's all that accurate?
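For a sense of scale, here's the quick arithmetic behind that half-second-a-day figure, plus a comparison against a common ballpark spec for a cheap quartz crystal (the ±20 ppm rating is a typical datasheet figure, not a measurement of any particular clock):

```python
# Quick arithmetic behind the drift figures above (assumed, not measured):
SECONDS_PER_DAY = 24 * 3600

drift = 90 / (182 * SECONDS_PER_DAY)  # 90 s lost over ~6 months, as a fraction
print(f"{drift * 1e6:.1f} ppm")                # ~5.7 ppm
print(f"{drift * SECONDS_PER_DAY:.2f} s/day")  # ~0.49 s/day

# A garden-variety 32.768 kHz watch crystal is often rated around +/-20 ppm,
# i.e. up to ~1.7 s/day of drift before temperature effects even kick in.
print(f"{20e-6 * SECONDS_PER_DAY:.1f} s/day at 20 ppm")
```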