ELI5: How are precision calibration tools themselves calibrated?

476 views

Feels like a chicken-and-egg scenario. Let’s say I get my torque wrench from work sent off to be calibrated, and that’s calibrated with something that itself needs to be calibrated, and so on and so forth. How’s that figured out?

In: 430

27 Answers

Anonymous 0 Comments

There’s a very good book out there by Simon Winchester called The Perfectionists that covers precision, precision instruments and measurement, and how it is all arrived at.

Anonymous 0 Comments

A torque wrench can be calibrated with a bucket of water and a tape measure.

Sure, you need to know how much water is in the bucket, so we use a measuring jug. The measuring jug is calibrated using an age-old system of measuring by volume, size or weight – but it mostly boils down to the one true kilogram: for a long time there was a perfect physical example of 1 kg, the International Prototype Kilogram, kept in a vault near Paris.

The tape measure, too – they’re not that accurate anyway. But say we used a micrometer to mark out 30 cm: the micrometer is usually calibrated against the tool before it, and that one against the one before it, and so on.
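To make the bucket-of-water idea concrete, here’s a minimal sketch of the arithmetic involved – the water volume, bucket mass and lever arm are all made-up illustration values, not a real procedure. The torque is just the weight of the hung water times its distance from the wrench’s pivot:

```python
# Rough check of a torque wrench using a bucket of water.
# All numbers here are made-up illustration values.

G = 9.81  # standard gravity, m/s^2

def applied_torque(water_litres: float, bucket_kg: float, arm_m: float) -> float:
    """Torque (N*m) from hanging a bucket of water at arm_m from the pivot.

    1 litre of water has a mass of ~1 kg, which is why a calibrated
    measuring jug can stand in for a calibrated mass.
    """
    mass_kg = water_litres * 1.0 + bucket_kg  # ~1 kg per litre of water
    return mass_kg * G * arm_m

# Hang 10 L of water in a 0.5 kg bucket half a metre along the wrench:
torque = applied_torque(water_litres=10.0, bucket_kg=0.5, arm_m=0.5)
print(f"Applied torque: {torque:.1f} N*m")  # ~51.5 N*m
```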

Nowadays, with modern machining, these are automated processes, and most precision tools are probably ‘just’ machined to spec.

I was watching a documentary about ancient Egyptian tombs and how the sarcophagi were enclosed and locked using copper rods. It was quite interesting given the time period – and of course the crazy thick, precisely cut granite they used, which fit and slid together like a glove.

I honestly would like to know how they specced that shit back then.

Anonymous 0 Comments

There’s one other complication that I think is still worth mentioning for an ELI5, and that is additive errors. Say you have a standard that is calibrated for one second, but you want to use it to calibrate something that measures days. Since there is some uncertainty associated with each second of the standard, you end up with a greater uncertainty in the day measurement, because it’s calibrated against a lot of seconds. Usually that is still more than good enough.
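A rough sketch of how those errors add up, assuming a made-up per-second uncertainty. If the per-second errors are independent they combine in quadrature (growing with the square root of the count); a systematic error in the standard just scales linearly:

```python
import math

SECONDS_PER_DAY = 86_400
sigma_second = 1e-6  # assumed uncertainty per second (made-up value)

# Independent random errors add in quadrature:
sigma_day_random = math.sqrt(SECONDS_PER_DAY) * sigma_second

# A systematic (fully correlated) error just scales with the count:
sigma_day_systematic = SECONDS_PER_DAY * sigma_second

print(f"random:     +/- {sigma_day_random:.4f} s per day")      # ~0.0003 s
print(f"systematic: +/- {sigma_day_systematic:.4f} s per day")  # ~0.0864 s
```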

I ran into this when an idiot Quality Assurance Engineer insisted we calibrate a 300 m steel tape by sending it down to a nuke plant where they could calibrate length, traceable back to official standards. But they couldn’t measure the whole thing at once, so they had to do it in sections, and the uncertainties were greater. We didn’t tell him; we just ticked the box. We also didn’t tell him we didn’t correct for thermal expansion, or for the stretching of the tape as you hung it down a well. But as long as our working tapes were calibrated against that one, we were internally consistent, and that was what really mattered.
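For a sense of why the thermal-expansion correction matters on a tape that long, here’s a minimal sketch. The temperature offset is a made-up value; the linear expansion coefficient of steel is roughly 12 ppm per degree Celsius:

```python
ALPHA_STEEL = 12e-6  # linear expansion coefficient of steel, ~per deg C

def thermal_error_m(length_m: float, delta_t_c: float) -> float:
    """Length change of a steel tape used away from its calibration
    temperature: dL = alpha * L * dT."""
    return ALPHA_STEEL * length_m * delta_t_c

# A 300 m tape used 10 C away from its calibration temperature:
err = thermal_error_m(300.0, 10.0)
print(f"Thermal error: {err * 1000:.0f} mm")  # ~36 mm over 300 m
```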

Anonymous 0 Comments

I had a similar thought about straight edges. How did the first perfectly straight edge get created? How did they know it was straight if there was nothing to compare it to? Etc., etc.

Anonymous 0 Comments

Is this why Garrus is ALWAYS doing calibrations?

Anonymous 0 Comments

2147_m has a great explanation. However, one thing I want to point out that they didn’t mention is that the “highest level” standard you can compare against is not based on a physical object. All measurements are derived from concepts. 1 meter (the standard unit of length – even feet are derived from meters) is defined as the distance light travels in a vacuum in 1/299,792,458 of a second. 1 second is defined as the time it takes the radiation from a particular transition in a cesium atom to oscillate 9,192,631,770 times. And so on: all units of measurement are based on concepts, not physical items. This means that, in principle (if not necessarily in practice), anyone can perform a calibration without having to send their equipment off to compare it against other equipment. It also means the measurements will never change over time, which is something that happens to any physical object.
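A toy illustration of how those definitions chain together – the two constants are the actual exact defining values; the code is just the arithmetic they imply:

```python
# The SI definitions chain together as exact arithmetic.
CESIUM_HZ = 9_192_631_770      # cesium-133 transition frequency, Hz (exact)
SPEED_OF_LIGHT = 299_792_458   # speed of light in vacuum, m/s (exact)

# 1 second := the time for 9,192,631,770 cesium oscillations,
# so one oscillation period is:
period_s = 1 / CESIUM_HZ

# 1 meter := the distance light travels in 1/299,792,458 s.
# Count cesium oscillations during that interval:
oscillations_per_meter = CESIUM_HZ / SPEED_OF_LIGHT
print(f"~{oscillations_per_meter:.3f} cesium periods per meter of light travel")
# Anyone with a cesium clock and a light source can, in principle,
# realize the meter without any physical artifact.
```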

Veritasium has three great videos on how the kilogram went from being defined as a physical object to being defined as a concept. The first two videos are about how scientists were attempting to use two different methods to achieve this; the last is about how they actually went about achieving it and how it’s defined today. They are well worth a watch if you’re interested.

Anonymous 0 Comments

Each tool is calibrated by another, more precise tool. Eventually you work your way back to a “master standard”: some tool or measurement which effectively defines the unit. Take units of length: a factory making rulers creates the marks based on a set of calipers or micrometers. Those calipers or micrometers are calibrated against a set of reference bars which are extremely precisely ground and certified to be a certain length. The factory which makes the reference bars uses an even more precise set, which comes from a laboratory holding a master length standard. Currently, length is defined via precise measurements of the wavelength of light, so the precision of the light-wavelength measurement sets the precision of everything down the line. In the olden days there was actually a platinum bar that was, by definition, exactly one meter long. It was kept locked away under carefully controlled conditions, and every few years it would be taken out and used to calibrate other instruments.
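A toy model of that chain, showing how each calibration step can only ever be slightly worse than the standard above it. The tool names and uncertainty numbers below are invented for illustration; the point is just that independent errors accumulate in quadrature down the chain:

```python
import math

# (tool, uncertainty it adds at its own calibration step, in meters)
chain = [
    ("national wavelength standard", 1e-9),
    ("laboratory reference bar",     1e-7),
    ("factory gauge block set",      1e-6),
    ("micrometer",                   1e-5),
    ("ruler",                        1e-4),
]

total_var = 0.0
for tool, sigma in chain:
    # Independent errors at each step combine in quadrature.
    total_var += sigma ** 2
    print(f"{tool:30s} +/- {math.sqrt(total_var):.2e} m cumulative")
```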
