I stuck a battery-powered watch in my safe a couple months back and just pulled it back out to find that it's running about two minutes behind true atomic time. I figure that the only thing the watch is supposed to do is know exactly how long a second is, but it's now off by more than 100 seconds.
So my question is, do darkness and cold temperatures make the watch run slower? How does that work?
2 months is 61 × 24 × 60 × 60 = 5,270,400 seconds. As a proportion of this, 100 seconds works out to just under 20 parts per million, or about 0.002%.
The standard clock crystals in watches are typically rated at ±50 parts per million (or ppm, as it's usually written). In other words, your watch could have been more than twice as wrong and still been within its expected accuracy.
So that is about the level of error you should expect over 2 months. A minute a month sounds bad, but when you work out the maths it represents a very small relative error.
It's the equivalent of a 1 km long tape measure being accurate to 2 cm (or, in American, a 1 mile tape measure being accurate to around 1¼″).
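
To make the arithmetic concrete, here is a minimal sketch of the calculation above (the 61 days, 100 seconds, and ±50 ppm rating are the figures from this answer, not measured values):

```python
# Sanity-check the drift numbers (assumed values from the answer above:
# 61 days elapsed, 100 seconds of observed drift, a +/- 50 ppm rating).
elapsed_s = 61 * 24 * 60 * 60        # seconds in roughly 2 months
drift_s = 100                        # observed error in seconds

drift_ppm = drift_s / elapsed_s * 1e6
print(f"elapsed:  {elapsed_s} s")                  # 5270400 s
print(f"observed: {drift_ppm:.1f} ppm")            # ~19.0 ppm

rating_ppm = 50                      # typical watch-crystal tolerance
worst_case_s = rating_ppm * 1e-6 * elapsed_s
print(f"worst case at {rating_ppm} ppm: {worst_case_s:.0f} s")  # ~264 s
```

At the full ±50 ppm rating, the watch could legitimately have drifted by over four minutes in the same period.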
I assume your watch is electronic and not mechanical, though something similar happens with mechanical watches too.
All electronics change their values and behaviour depending on temperature, and a watch has to be very precise, so every component and its value is selected and calibrated so that the watch is most accurate at the temperature it normally reaches while on your arm. When the watch sits in a drawer or somewhere cooler, a difference of perhaps 10 °C means the electronic components are no longer working at their optimal temperature and are therefore less accurate.
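
To put rough numbers on the temperature effect: the 32.768 kHz tuning-fork crystals used in watches typically slow down parabolically as the temperature moves away from a turnover point near 25 °C, with a coefficient of roughly -0.034 ppm/°C² (a common datasheet figure; the exact values for any given watch will differ). A minimal sketch under those assumptions:

```python
# Rough sketch of quartz tuning-fork temperature drift, using an
# assumed datasheet-typical coefficient of -0.034 ppm/degC^2 and a
# 25 degC turnover point (not the spec of any particular watch).
K = -0.034       # ppm per degC squared
T_TURNOVER = 25  # degC, where the crystal runs fastest

def drift_ppm(temp_c: float) -> float:
    """Frequency offset in ppm at a given temperature."""
    return K * (temp_c - T_TURNOVER) ** 2

for t in (35, 25, 15, 5):  # on the wrist, room, drawer, cold safe
    print(f"{t:3d} degC: {drift_ppm(t):+.1f} ppm")
```

Note that the curve is one-sided: whether the watch is warmer or colder than the turnover temperature, the crystal runs slow, which fits a watch losing time in a cool, dark safe.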