What is the science behind calibration weights? How can we be so sure that the weight we’ve been using to calibrate is exact and not a few decimals off, which would make the concept of density completely wrong?

In: Physics

4 Answers

Anonymous 0 Comments

Calibration weights are correct by definition, within a known tolerance.
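
As a rough sketch of what “within a known tolerance” means in practice, here’s a toy check against tolerance bands. The milligram limits are illustrative values loosely patterned on OIML R111 accuracy classes for a 1 kg weight, not an authoritative table.

```python
# Toy tolerance check for a nominal 1 kg calibration weight.
# The limits are illustrative, loosely based on OIML R111 accuracy
# classes for 1 kg; use the actual standard for real work.
TOLERANCE_MG = {
    "E1": 0.5,    # finest laboratory reference weights
    "E2": 1.6,
    "F1": 5.0,
    "M1": 50.0,   # ordinary commercial weights
}

def within_tolerance(measured_g: float, nominal_g: float, weight_class: str) -> bool:
    """True if the measured mass sits inside the class's tolerance band."""
    error_mg = abs(measured_g - nominal_g) * 1000.0
    return error_mg <= TOLERANCE_MG[weight_class]

print(within_tolerance(1000.0004, 1000.0, "E1"))  # True  (0.4 mg off)
print(within_tolerance(1000.0030, 1000.0, "E1"))  # False (3 mg off)
print(within_tolerance(1000.0030, 1000.0, "F1"))  # True
```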

Unlike a lot of other basic units, mass is annoyingly difficult to quantify in any absolute terms that we can usefully employ. Saying “1 kg is the mass of 1 cubic decimeter of water at such-and-such temperature and pressure” (the original definition) is perfectly rigorous but really hard to implement, because you’ve just turned the problem into one of measuring volume (and water purity) instead of mass.
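
As for the original worry about density: if the reference kilogram were off by some small fraction, every mass calibrated against it would be off by roughly the same fraction, so densities would all scale slightly rather than become meaningless. A minimal sketch with made-up numbers:

```python
# Sketch: how a small fractional error eps in the mass reference would
# propagate into a density measurement rho = m / V. Every balance
# calibrated against that reference reports masses scaled by roughly
# (1 + eps) (sign aside), so densities scale by the same tiny factor.
def apparent_density(true_mass_kg: float, volume_m3: float, eps: float) -> float:
    """Density you'd report if the mass reference were off by fraction eps."""
    reported_mass_kg = true_mass_kg * (1.0 + eps)
    return reported_mass_kg / volume_m3

eps = 1e-6                               # reference off by one part per million
rho = apparent_density(1.0, 0.001, eps)  # one litre of water
print(rho)                               # ~1000.001 kg/m^3, shifted by 1 ppm
```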

So in 1889 they switched to a physical artifact: they made a platinum-iridium cylinder (the International Prototype of the Kilogram) and *defined* it to be 1 kg, and that cylinder was used to calibrate everything else. It can’t be wrong, because it *is* the definition. Then you just compare other weights to *the* kilogram and keep track of the tolerances of your measuring devices to make calibration weights for general use. The problem with this approach is that if your reference mass drifts relative to its official copies, you get into a fight about which one is right.
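
Here’s a minimal sketch of how that tolerance bookkeeping is often modeled: each comparison step in the calibration chain contributes its own uncertainty, and the simple textbook way to combine independent contributions is in quadrature. The per-step values below are invented for illustration.

```python
# Sketch of tolerance bookkeeping along a calibration chain: each step
# compares a weight to the one above it and contributes its own standard
# uncertainty. Combining independent contributions in quadrature
# (root-sum-square) is the simple textbook model; real calibration
# certificates follow the GUM in more detail. The values are invented.
import math

def combined_uncertainty_mg(step_uncertainties_mg: list[float]) -> float:
    """Root-sum-square of the per-step standard uncertainties, in mg."""
    return math.sqrt(sum(u * u for u in step_uncertainties_mg))

chain_mg = [
    0.01,  # primary realization -> national standard
    0.05,  # national standard   -> lab reference
    0.5,   # lab reference       -> working standard
    2.0,   # working standard    -> your bench weight
]
print(f"{combined_uncertainty_mg(chain_mg):.2f} mg")  # ~2.06 mg, dominated by the last step
```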

Very recently (2019) we redefined the kilogram in terms of the Planck constant, the second, and the meter (which all have really rigorous definitions that don’t depend on a physical artifact). This lets us check reference weights against other things we can directly measure.
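
As an illustration of what “check against things we can directly measure” looks like, here is a heavily simplified sketch of the Kibble-balance weighing equation, m = U·I/(g·v). The numeric inputs are invented for the example, and the link back to the fixed Planck constant (through how the volts and amps are realized) is only noted in the comments.

```python
# Heavily simplified sketch of the Kibble-balance idea behind the 2019
# definition: balance mechanical power m*g*v against electrical power U*I,
# giving m = U*I / (g*v). In a real instrument U and I are realized via
# the Josephson and quantum Hall effects, which is where the fixed Planck
# constant enters; that detail is omitted here. All numbers are invented.
H_PLANCK = 6.62607015e-34  # J*s, fixed exactly by the 2019 redefinition (shown for reference)

def kibble_mass_kg(U_volts: float, I_amps: float, g_ms2: float, v_ms: float) -> float:
    """Simplified weighing equation: m = U * I / (g * v)."""
    return (U_volts * I_amps) / (g_ms2 * v_ms)

# Invented values chosen so the result lands near 1 kg:
print(kibble_mass_kg(U_volts=0.981, I_amps=0.02, g_ms2=9.81, v_ms=0.002))  # ~1.0
```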
