Why is cesium used to define a second, as opposed to other atoms that might be more common, like hydrogen or even oxygen? Also, how do we know that it's equal to one second if seconds are arbitrary?


Also I didn’t know what to flair this, sorry. Figured maybe physics, tech, or math, but I wasn’t sure.

Edit: apparently it was physics. *shrug*

In: Physics

3 Answers

Anonymous

You might have it a little backwards: the second was defined first, by a long while. Originally, the second was based on the rotation of the Earth – the day was divided into 24 hours, each hour into 60 minutes, and each minute into 60 seconds, so a second was 1/86,400 of a day. But the Earth’s rotation isn’t perfectly steady, so scientists wanted a more precise definition of the second – one that would hold anywhere in the universe and that anyone who bothered to make the measurement would agree on.
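Just as a quick sketch of that arithmetic (nothing beyond the 24 × 60 × 60 division described above), here it is in Python:

```python
# Old astronomical definition: a second is 1/86,400 of a day.
hours_per_day = 24
minutes_per_hour = 60
seconds_per_minute = 60

seconds_per_day = hours_per_day * minutes_per_hour * seconds_per_minute
print(seconds_per_day)        # 86400
print(1 / seconds_per_day)    # the fraction of a day that one second represents
```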

Cesium has been used in atomic clocks because its ground-state hyperfine transition absorbs and emits microwaves at one extremely sharp, reproducible frequency; in an atomic clock the cesium is vaporized and the electronics lock onto that frequency and count its oscillations. Most other transitions are much harder to measure this precisely, but cesium holds so close to that one value that the best cesium clocks won’t gain or lose a second over hundreds of millions of years. Scientists measured how many of those cesium oscillations fit into the second we’d already been using (it turns out to be 9,192,631,770), and they re-defined the second as exactly that many. So the second we’ve always used is more-or-less the same, only now it’s defined incredibly precisely by a value that will not change in any measurable way, which means that anyone who builds a good atomic clock will have exactly the same time measurements as anyone else. It’s a silly, large number by everyday standards, but it’s reproducible and consistent, which is the key to a good measurement.
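To see how that big number and the second relate, here’s a minimal sketch in Python (the frequency is the exact value fixed by the SI definition; everything else is just the reciprocal relationship between frequency and period):

```python
# SI definition: the cesium-133 hyperfine frequency is fixed at exactly
# 9,192,631,770 Hz, so one second is that many oscillation periods.
CESIUM_HZ = 9_192_631_770

period = 1 / CESIUM_HZ        # duration of a single oscillation, in seconds
print(period)                 # ~1.0878e-10 s per cycle
print(CESIUM_HZ * period)     # counting 9,192,631,770 cycles adds up to 1.0 second
```

In other words, the clock doesn’t measure “one second” directly – it counts cycles of that microwave signal, and when the count hits 9,192,631,770 it declares that a second has passed.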
