ELI5: How are precision calibration tools themselves calibrated?

461 views

Feels like a chicken-and-egg scenario. Let’s say I get my torque wrench from work sent off to be calibrated, and that’s calibrated with something that itself needs to be calibrated, and so on and so forth. How’s that figured out?

In: 430

27 Answers

Anonymous 0 Comments

Each tool is calibrated by another, more precise tool. Eventually you work your way back to a “master standard”, which is some sort of tool or measurement that effectively defines the unit. Take units of length, for instance: a factory making rulers creates the marks based on a set of calipers or micrometers. Those calipers or micrometers are calibrated against a set of reference bars which are extremely precisely ground and certified to be a certain length. The factory which makes the reference bars uses an even more precise set, which comes from a laboratory holding the master length standard. Currently, length is defined in terms of the speed of light and realized with extremely precise optical measurements, so the precision of those measurements sets the precision of everything down the line. In the olden days there actually was a physical master – a platinum alloy bar defined to be exactly one unit long (the international prototype metre). It was kept locked away under carefully controlled conditions, and every few years it was taken out and used to calibrate other instruments.
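To make that chain concrete, here’s a minimal sketch in Python (the tools, names and uncertainty numbers are made up for illustration). Each instrument carries its own uncertainty plus whatever it inherited from the thing it was calibrated against:

```python
import math

class Instrument:
    def __init__(self, name, own_uncertainty_mm, reference=None):
        self.name = name
        self.own_uncertainty_mm = own_uncertainty_mm  # error this instrument adds on its own
        self.reference = reference                    # the more precise tool it was calibrated against

    def total_uncertainty_mm(self):
        # Combine with the reference's uncertainty in quadrature
        # (assumes the error sources are independent).
        ref = self.reference.total_uncertainty_mm() if self.reference else 0.0
        return math.hypot(self.own_uncertainty_mm, ref)

# The chain described above: national standard -> reference bar -> micrometer -> ruler
national_standard = Instrument("laser interferometer (realizes the unit)", 0.00001)
reference_bar     = Instrument("certified reference bar", 0.0005, national_standard)
micrometer        = Instrument("shop micrometer", 0.002, reference_bar)
ruler             = Instrument("factory ruler", 0.1, micrometer)

for tool in (national_standard, reference_bar, micrometer, ruler):
    print(f"{tool.name}: ±{tool.total_uncertainty_mm():.5f} mm")
```

The uncertainty can only grow as you move down the chain, which is why every working tool has to trace back to something more precise than itself.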

Anonymous 0 Comments

2147_m has a great explanation. However, one thing I want to point out that they didn’t mention is that the “highest level” standard you can compare against is not based on a physical object. All measurements are derived from concepts. 1 meter (the standard unit of length – even feet are defined in terms of meters) is defined as the distance light travels in a vacuum in 1/299,792,458 seconds. 1 second is defined as the time it takes the radiation from a particular transition in a cesium-133 atom to oscillate 9,192,631,770 times. And so on – all units of measurement are based on concepts, not physical items. This means that, in principle (if not always in practice), anyone can perform a calibration without having to send their equipment off to be compared against other equipment. It also means the definitions will never drift over time, which is something that happens to any physical object.
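A tiny sketch of what “defined by a concept” buys you: once the speed of light is fixed by definition, measuring a length reduces to measuring a time, which any suitably equipped lab can do without shipping an artifact around (the numbers below are illustrative, not real lab data):

```python
C = 299_792_458  # m/s – exact by definition since 1983, not a measured value

def length_from_transit_time(seconds: float) -> float:
    """Length of a vacuum path, given the time light takes to traverse it."""
    return C * seconds

print(length_from_transit_time(1 / C))   # exactly 1.0 m, by construction
print(length_from_transit_time(1e-9))    # ~0.2998 m – light travels about 30 cm per nanosecond
```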

Veritasium has three great videos on how the kilogram went from being defined as a physical object to being defined as a concept. The first two videos are about how scientists were attempting to use two different methods to achieve this; the last is how they actually went about achieving it and how it’s defined today. They are well worth a watch if you’re interested.

Anonymous 0 Comments

Is this why Garrus is ALWAYS doing calibrations?

Anonymous 0 Comments

I had a similar thought about straight edges. How did the first perfectly straight line get created? How did they know it was straight if there was nothing to compare it to… etc etc

Anonymous 0 Comments

There’s one other complication that I think is still worth mentioning for an ELI5, and that is additive errors. Say you have a standard that is calibrated for one second, but you then want to use it to calibrate something that measures days. Since there is some uncertainty associated with each second of the standard, you end up with a greater uncertainty in the day measurement that is calibrated against a lot of those seconds. Usually that is still more than good enough.
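A rough sketch of how that plays out, with an invented ±1 µs uncertainty per realized second. How much the errors accumulate depends heavily on whether they are systematic (same sign every second) or random (independent each second):

```python
import math

SECONDS_PER_DAY = 86_400
u_second = 1e-6   # illustrative: ±1 microsecond uncertainty per realized second

# Worst case: a systematic offset repeats identically every second.
u_day_systematic = SECONDS_PER_DAY * u_second

# Best case: independent random errors add in quadrature.
u_day_random = math.sqrt(SECONDS_PER_DAY) * u_second

print(f"systematic: ±{u_day_systematic:.4f} s per day")   # ±0.0864 s
print(f"random:     ±{u_day_random:.6f} s per day")       # ±0.000294 s
```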

I ran into this when an idiot Quality Assurance Engineer insisted we calibrate a 300 m steel tape by sending it down to a nuke plant where they could calibrate length traceable back to official standards. But they couldn’t measure the whole thing at once, so they had to do it in sections, and the uncertainties were greater. We didn’t tell him and just ticked the box. We also didn’t tell him we didn’t correct for thermal expansion or the stretching of the tape as you hung it down a well. But as long as our working tapes were calibrated against that one, we were internally consistent, and that was what really mattered.

Anonymous 0 Comments

A torque wrench can be calibrated with a bucket of water and a tape measure.

Sure, you need to know how much water is in the bucket, so we use a measuring jug. The measuring jug is calibrated using an age-old system of measuring by volume, size or weight – but it mostly boils down to the one true kilogram.
I can’t remember exactly where in the world it is, but there is an official physical example of 1 kg kept somewhere.

The tape measure is similar – tape measures aren’t very accurate anyway – but say we used a micrometer to measure the 30 cm instead: the micrometer is usually calibrated against the tool before it, and that one against the one before it, and so on.
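Here’s the back-of-the-envelope version of that bucket-and-tape-measure calibration, with made-up numbers (and ignoring the weight of the wrench arm itself):

```python
G = 9.81  # m/s^2 – local gravity, known far more precisely than we need here

def applied_torque_nm(water_litres: float, lever_arm_m: float, bucket_kg: float = 0.0) -> float:
    """Torque from a bucket of water hung at the end of a horizontal lever arm."""
    mass_kg = water_litres * 1.0 + bucket_kg   # 1 litre of water is very close to 1 kg
    return mass_kg * G * lever_arm_m

# 10 litres of water in a 0.5 kg bucket, hung 0.4 m from the square drive:
print(f"{applied_torque_nm(10, 0.4, 0.5):.1f} N·m")   # ~41.2 N·m – compare with the wrench's reading
```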

Nowadays with modern machining, these are automated processes and probably most precision tools are ‘just’ machined to spec.

I was watching a documentary about ancient Egyptian tombs and how they enclosed the sarcophagi and locked them using copper rods. It was quite interesting given the time period – and of course the crazy thick, precisely worked granite they used, which fit and slid together like a glove.

I honestly would like to know how they spec’d that shit back then.

Anonymous 0 Comments

There’s a very good book out there by Simon Winchester called *The Perfectionists* that covers precision, precision instruments and measurement, and how it is all arrived at.

Anonymous 0 Comments

I happen to know of a textbook that covers this exact topic: [Foundations of Mechanical Accuracy](https://archive.org/details/FoundationsOfMechanicalAccuracy). It’s actually quite complex when you get down to it, but the other comments have it essentially right. You need a “Master” calibration tool that is more precise than all the rest of your tools which you can measure against. The book goes into detail on how you can create some of these master tools.

For instance, how do you create a perfectly flat surface plate from scratch (or as near perfect as can be)? If you already have a master flat plate to measure against, it’s easy — all you do is push your plate against the master and see where they aren’t touching evenly (dye can be used to make this clearer). Once you know where they aren’t touching flat, you can sand your plate down until it does. But how do you make a master plate without a master to reference?

The trick is to make three different flat plates and compare them to each other. Call them A, B, and C. Put A and B together, then sand them down repeatedly until they lie flat against each other, even when rotated 90/180 degrees. They’ll be *mostly* flat, but you can’t be sure that one doesn’t have a depression and the other a bulge. So what you do next is sand down C until it meshes with A. Since B and C both mesh with A, they’ll both have the same bulge or depression. Now you can mesh them with *each other*, and sand both down to get rid of that bulge/depression. If you keep repeating this process alternating between A, B, and C, eventually all three plates will lie flat against each other, and you can be confident that they’re all near-perfectly flat.
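A simplified 1-D illustration of why two plates aren’t enough but three are – this is just the bulge/depression argument above written out numerically, treating each plate as a height profile and saying two plates “mate” when one’s profile is the negative of the other’s:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200)

# The two-plate trap: a bulged plate and a matching hollowed plate mate perfectly,
# yet neither one is flat.
a = 0.02 * np.sin(np.pi * x)   # plate A: gentle bulge
b = -a                         # plate B: matching hollow, laps perfectly against A
print("max A-B gap:", np.abs(a + b).max())   # 0.0 – they mate, but both are curved

# Bring in plate C and require it to mate with A as well:
c = -a                         # forced to the same hollow shape as B
print("max B-C gap:", np.abs(b + c).max())   # 0.04 – B and C do NOT mate
# The only way all three pairs can mate (a+b = a+c = b+c = 0 everywhere)
# is if a = b = c = 0, i.e. all three plates are flat.
```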

Each kind of master requires different tricks like this, but they all boil down to the same idea – gradually calibrate multiple different master versions against each other until they all agree with each other.

Anonymous 0 Comments

It’s important to remember that all measurements have an associated uncertainty. This includes the practical realizations of the seven SI base units – the units for length, time, mass, temperature, and so on.

These base units are only redefined when we find a better method that results in a reduced uncertainty, easier implementation, etc. We just went through this with mass, the last base unit defined by a physical artifact. For a lot of reasons we don’t want base units defined by physical artifacts, which can be lost or damaged. Work had been ongoing to redefine the kilogram for many years, and just recently a method was adopted that has better uncertainty and can be realized by various labs around the world.

At all but the national research labs (NIST, NRC, etc.), physical standards are still used – in fact, even NIST and NRC use physical standards for most of their day-to-day work. Weights and measures inspectors, for example, use various grades of stainless steel and cast iron standards depending upon the level of traceability required. High-precision standards are used to test precious-metal scales, while lower-precision, but still calibrated and traceable, cast iron standards are used to test and calibrate larger freight and vehicle scales.

Canada’s K50/74 prototype kilograms are physical artifacts that are still the primary reference standards for the country. All other mass standards are compared to these standards. It is only when the primary standards themselves need to be tested (they are never adjusted) that the Kibble balance (the new definition) would be used. Previously, the national prototypes were physically carried to Sèvres, France for comparison with the International Prototype Kilogram (colloquially, le Grand K).

https://en.m.wikipedia.org/wiki/International_Prototype_of_the_Kilogram

Anonymous 0 Comments

The good news is that, along this chain, there are definitely “accuracy multipliers” and forms of natural calibration that are often “good enough” to meet accuracy requirements.

For example: I could measure one foot-pound on a torque wrench with a balanced two-foot bar and a one-pound weight (the weight hangs one foot from the pivot). Both the weight and the distance have to be pretty accurate.

But suppose it’s a ten-pound weight on a balanced 20-foot bar (a 10-foot lever arm, so 100 foot-pounds), with a mechanism (1:100 gearing) to reduce that torque by a factor of 100. Still one foot-pound, but the same absolute errors in the weight and the distance now matter roughly ten times less, because each is a much smaller fraction of what’s being measured.
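Putting rough numbers on that (illustrative values, and assuming the gearing itself is ideal): give both setups the same ±0.01 lb and ±0.01 ft measuring errors and see how much they move the final torque.

```python
def torque_ftlb(weight_lb, arm_ft, reduction=1.0):
    return weight_lb * arm_ft / reduction

d_weight, d_arm = 0.01, 0.01     # assumed absolute measuring errors

# Setup 1: 1 lb at the end of a balanced 2 ft bar (1 ft lever arm), no gearing.
t1 = torque_ftlb(1.0, 1.0)
e1 = torque_ftlb(1.0 + d_weight, 1.0 + d_arm) - t1

# Setup 2: 10 lb at the end of a balanced 20 ft bar (10 ft arm), 1:100 reduction.
t2 = torque_ftlb(10.0, 10.0, reduction=100.0)
e2 = torque_ftlb(10.0 + d_weight, 10.0 + d_arm, reduction=100.0) - t2

print(f"setup 1: {t1:.3f} ft·lb, error ~{e1:.4f} ft·lb")   # 1.000, ~0.0201
print(f"setup 2: {t2:.3f} ft·lb, error ~{e2:.4f} ft·lb")   # 1.000, ~0.0020
```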

Some calibration (when extreme accuracy isn’t needed) is easy. Ice water is always 0 °C and boiling water is 100 °C (near sea-level pressure), so those are probably the most common references for thermometer calibration.
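That two-point approach boils down to a linear correction. A tiny sketch (assuming near-sea-level pressure, so water really does boil close to 100 °C):

```python
def make_correction(reading_at_0c: float, reading_at_100c: float):
    """Return a function mapping raw thermometer readings to corrected temperatures."""
    scale = 100.0 / (reading_at_100c - reading_at_0c)
    return lambda raw: (raw - reading_at_0c) * scale

# Example: a thermometer that reads 1.5 °C in ice water and 98.5 °C in boiling water.
correct = make_correction(1.5, 98.5)
print(correct(1.5))    # 0.0
print(correct(98.5))   # 100.0
print(correct(50.0))   # 50.0, plus whatever nonlinearity the thermometer has in between
```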
