ELI5: How are precision calibration tools themselves calibrated?

384 views

Feels like a chicken-and-egg scenario. Let’s say I get my torque wrench from work sent off to be calibrated, and that’s calibrated with something that itself needs to be calibrated, and so on and so forth. How’s that figured out?

In: 430

27 Answers

Anonymous 0 Comments

A way of manufacturing a precision device is created, and then other devices using a similar mechanism can be calibrated against it. A device is made that derives its calibration from some universal reference. Another device is then calibrated against this. If the original device is inaccurate, the error can sometimes be calculated very accurately, or the average of a number of readings from a number of devices is taken.
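A tiny sketch of that last idea, averaging readings from several devices (all numbers invented): independent random errors tend to cancel, so the average is usually closer to the truth than any single reading.

```python
import random
import statistics

# Simulate many devices reading the same true value with independent random errors.
true_value = 100.0
readings = [true_value + random.gauss(0, 0.5) for _ in range(25)]

single_error = abs(readings[0] - true_value)
average_error = abs(statistics.mean(readings) - true_value)

print(f"error of one reading:        {single_error:.3f}")
print(f"error of the average of 25:  {average_error:.3f}")  # typically ~5x smaller (sqrt of 25)
```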

Anonymous 0 Comments

The key to calibration is traceability. If I buy a calibrated tool, the company that calibrated it used their local standard. In the paperwork they provide me, they include provenance showing when their local standard was last calibrated, and what it was calibrated to. In the US, these chains of provenance typically lead to a standard approved by the National Institute of Standards and Technology, a part of the US government.
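A toy sketch of what that chain of provenance can look like if you model it as data and walk it back to the national standard (the names, dates, and structure here are invented for illustration, not any real lab's records):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CalibrationRecord:
    """One link in a traceability chain (illustrative only, not a real schema)."""
    instrument: str
    calibrated_against: Optional["CalibrationRecord"]  # None = national primary standard
    date: str

def trace_to_root(record: CalibrationRecord) -> list:
    """Walk the chain of provenance from a tool back up to the top standard."""
    chain = []
    node = record
    while node is not None:
        chain.append(f"{node.instrument} (calibrated {node.date})")
        node = node.calibrated_against
    return chain

# Invented example chain: shop torque wrench -> cal lab's reference -> NIST-traceable standard
nist_standard = CalibrationRecord("NIST torque standard", None, "2023-01-10")
lab_reference = CalibrationRecord("Cal-lab reference transducer", nist_standard, "2023-06-02")
shop_wrench = CalibrationRecord("Shop torque wrench #42", lab_reference, "2024-03-15")

for link in trace_to_root(shop_wrench):
    print(link)
```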

Anonymous 0 Comments

It used to be that we had physical objects that were calibrated by definition. Whenever we wanted to calibrate an instrument to the highest degree, so we could use it to calibrate other instruments, we would use this object to calibrate the instrument, and by definition the object itself was correct.

The issue with this was that these objects changed slightly over time, so the definition effectively changed with them. For example, when the master kilogram lost a tiny bit of weight over the years, the definition of the kilogram shifted with it, and all the calibrated instruments were now out of calibration.

In order to fix this, we now use the results of carefully chosen experiments. You can measure a physical property, such as the speed of light, and you know the result by definition; if you get a different result, it is because your instrument is out of calibration and needs to be adjusted.

Anonymous 0 Comments

With something like a torque wrench, you can calibrate it with an ordinary scale/balance. Multiply the force applied to the wrench by the length of the lever arm to get the torque, then compare that to the setting on the wrench. Don’t forget to account for the torque from the weight of the wrench itself.
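A rough sketch of that arithmetic, assuming you hang a known mass from a horizontal wrench at a measured distance from the drive (all the numbers below are made up for illustration):

```python
# Checking a torque wrench against a known weight and lever arm (invented numbers).
g = 9.80665                 # standard gravity, m/s^2

hanging_mass_kg = 10.0      # calibrated mass hung from the wrench handle
arm_length_m = 0.45         # distance from the square drive to where the mass hangs
wrench_mass_kg = 1.2        # mass of the wrench itself
wrench_cg_m = 0.20          # distance from the drive to the wrench's centre of gravity

# Torque from the hanging mass, plus the torque from the wrench's own weight
applied_torque_nm = hanging_mass_kg * g * arm_length_m + wrench_mass_kg * g * wrench_cg_m

wrench_setting_nm = 45.0    # what the wrench was set to when it clicked
error_percent = (wrench_setting_nm - applied_torque_nm) / applied_torque_nm * 100

print(f"True torque:     {applied_torque_nm:.2f} N·m")
print(f"Wrench setting:  {wrench_setting_nm:.2f} N·m")
print(f"Error:           {error_percent:+.1f} %")
```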

Anonymous 0 Comments

At the lab I work at, we have a guy from the manufacturer(s) come in and calibrate our tools, every 6 months or every year depending on the tool.

Anonymous 0 Comments

In my college electronics classes, measurement tools were calibrated with other precision tools; it’s called metrology. You have to use a tool that’s a certain amount more accurate than the one you’re calibrating, and so on and so forth. In college we had little handheld old-school analog meters. We also had the machine to calibrate them, which took up an entire table and was heavy as hell.
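As a side note on “a certain amount more accurate”: a common metrology rule of thumb (not stated in this thread, just a typical convention) is a test accuracy ratio of about 4:1 between the reference and the unit under test. A tiny sketch with invented numbers:

```python
def meets_tar(reference_uncertainty: float, unit_tolerance: float, required_ratio: float = 4.0) -> bool:
    """True if the reference is at least `required_ratio` times tighter than the unit's tolerance."""
    return unit_tolerance / reference_uncertainty >= required_ratio

# e.g. a bench meter good to ±0.05 V used to check a handheld meter spec'd at ±0.25 V
print(meets_tar(reference_uncertainty=0.05, unit_tolerance=0.25))  # True (5:1)
```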

Anonymous 0 Comments

Same problem here… how were programming languages (C++, Java, etc.) programmed in the first place?

Anonymous 0 Comments

The SI system is what defines meters, seconds, kilograms, newtons, etc. Your wrench may use foot-pounds-force, but feet and pounds-force are nowadays defined as certain numbers of meters and newtons.
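For example, because the foot and the pound-force now have exact SI definitions, a reading in foot-pounds-force converts to newton-meters by pure arithmetic (the factors below are the exact defined values):

```python
# Exact SI definitions of the foot and the pound-force make the conversion exact.
FT_IN_M = 0.3048                    # meters per foot, exact by definition
LBF_IN_N = 0.45359237 * 9.80665     # newtons per pound-force: defined pound mass x standard gravity

def ft_lbf_to_nm(torque_ft_lbf: float) -> float:
    """Convert a torque from foot-pounds-force to newton-meters."""
    return torque_ft_lbf * FT_IN_M * LBF_IN_N

print(ft_lbf_to_nm(100.0))  # 100 ft·lbf ≈ 135.58 N·m
```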

The SI system is defined in such a way that scientists can carry out experiments to determine the length of a meter, etc., precisely.

For example, a particular atom in a particular state will give off radio waves with a specified number of waves per second. By counting waves, you can measure a second. And this is not an approximation: the second is defined as the time it takes for a certain number of waves.

Another example: Light and radio waves travel through vacuum at a fixed speed, which is specified in the SI standard. Using an accurate clock (see previous paragraph), you can measure how far light goes in a certain fraction of a second, and that is a meter.
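To make that concrete, here is a tiny calculation using the two exactly defined constants (the constants are the real defined values; the rest is just arithmetic for illustration):

```python
# Defined exactly by the SI:
CS_HYPERFINE_HZ = 9_192_631_770   # cesium-133 transition cycles in one second (defines the second)
C_M_PER_S = 299_792_458           # speed of light in vacuum, m/s (defines the meter)

# One cesium cycle therefore lasts exactly 1 / 9,192,631,770 of a second...
cycle_duration_s = 1 / CS_HYPERFINE_HZ

# ...and one meter is the distance light covers in 1 / 299,792,458 of a second.
time_per_meter_s = 1 / C_M_PER_S

print(f"One cesium cycle lasts  {cycle_duration_s:.3e} s")
print(f"Light crosses 1 m in    {time_per_meter_s:.3e} s "
      f"(about {time_per_meter_s / cycle_duration_s:.0f} cesium cycles)")
```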

Of course, all the above is completely impractical for day-to-day use.

So there are a small number of labs worldwide, typically one per country, which specialise in measurement. They will have a number of standards, such as 1 kg lumps of metal or metal sticks with two marks precisely 1 m apart. Those standards will have been checked by the experiments above, or against standards that were calibrated against those experiments. (E.g. there are only a handful of labs that have done the kilogram experiment.)

In turn, those standards will be used to calibrate other standards or measuring devices, which will be used to calibrate other standards or measuring devices, and this repeats many times until one of those calibrated devices is used to calibrate your wrench.

Each time you calibrate something, you end up with less accuracy than you started with. But your wrench probably doesn’t need to be accurate to one part per million; even one part per thousand is probably overkill.
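A toy illustration of how uncertainty stacks up along such a chain (the per-step numbers are invented; independent errors roughly add in quadrature):

```python
import math

# Each calibration step adds its own uncertainty; for independent errors they
# combine in quadrature. All figures are invented, just to show the trend.
chain = [
    ("national primary standard", 0.0001),   # relative uncertainty contributed at each step
    ("accredited reference lab",  0.0005),
    ("commercial cal lab",        0.002),
    ("your torque wrench",        0.01),
]

combined = 0.0
for name, u in chain:
    combined = math.sqrt(combined**2 + u**2)
    print(f"{name:28s} combined uncertainty ≈ {combined * 100:.3f} %")
```

Notice that the last step, the wrench itself, dominates: the chain above it barely matters as long as each level is a few times better than the next.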

Anonymous 0 Comments

Super captivating BBC documentary about measurements and weight standards. They know how to tell a story. https://www.youtube.com/watch?v=XofuloR6x74

Anonymous 0 Comments

A calibration standard is, in general, calibrated to a better quality standard at a higher laboratory with better comparison equipment. However, at some point, there has to be a top laboratory with a reference standard which is the end of the chain.

Historically, this was done with special, carefully made artifacts kept under very controlled conditions. For example, for many years a laboratory in Paris kept a stick with two marks engraved 1 meter apart, and this was the reference meter. Another laboratory might get a stick and put two marks on it – but it would then have to be shipped to Paris and measured against the reference meter stick. The lab would then keep a record of the exact length.

These days, measures have been redefined in terms of something fundamental which you can measure with a scientific experiment. The official meter is no longer the length of a stick in Paris; instead, there is an equation relating the length of a meter to the result of a scientific experiment. For example, top calibration labs no longer use sticks as their top reference. Instead, they have a scientific apparatus which can perform a laser spectroscopy experiment that measures the time it takes for light to travel a certain distance. The lab can put a stick in the apparatus, and it will give the exact length based on the equation and the result of the experiment.

Similarly, the second used to be defined as a fraction of the length of the day. A calibration laboratory would do an experiment to measure the height of the sun, and they could compare a clock to when the sun reached its highest point in the day, marking noon. These days, the second is defined as a fixed number of cycles of a specific transition of a cesium atom. This transition frequency can be measured by microwave spectroscopy, so you can compare a clock to the transition and adjust it as needed. In fact, you can go out and buy an atomic clock, which is just a good-quality clock packaged with a spectroscopy apparatus and an auto-adjust system that checks the clock against the spectroscopy apparatus hundreds of times per second and adjusts it as needed.
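A very rough sketch of that auto-adjust loop, assuming a local oscillator that drifts and gets repeatedly compared against the cesium transition and nudged back (the drift model, gain, and starting error are all invented; real clock steering is far more sophisticated):

```python
import random

CS_HZ = 9_192_631_770            # reference frequency, exact by definition

clock_hz = CS_HZ * (1 + 2e-9)    # local oscillator starts slightly fast (invented offset)
gain = 0.5                       # how strongly each comparison corrects the clock

for step in range(10):
    measured_error_hz = clock_hz - CS_HZ     # result of the spectroscopy comparison
    clock_hz -= gain * measured_error_hz     # steer the oscillator toward the reference
    clock_hz += random.gauss(0, 1e-3)        # a little new drift before the next check
    print(f"check {step}: fractional error {(clock_hz - CS_HZ) / CS_HZ:+.2e}")
```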
