ELI5: How are precision calibration tools themselves calibrated?

Feels like a chicken-and-egg scenario. Let’s say I get my torque wrench from work sent off to be calibrated, and that’s calibrated with something that itself needs to be calibrated, and so on and so forth. How’s that figured out?

27 Answers

Anonymous 0 Comments

Oh shit! There are never questions like this that I’m qualified to answer, but this one I can!

As stated by another redditor, there is what’s considered NIST traceability.

What that means is that there is an unbreakable chain of traceability back to the “standard” of measurement from which all other measurements are derived. This is agreed upon at an international level.

An oversimplification of this: imagine that somewhere there’s a vault with a perfect block that measures exactly 100 cm in length. (An example, not how it’s actually done.)

It’s protected, and it’s what every measurement of length is derived from: inches, meters, feet, kilometers, etc.

Every few years, very-high-accuracy secondary measuring “standards” are compared against the master standard.

This establishes the first level of traceability.

Each level down the chain from that increases the measurement “uncertainty” to account for variations in accuracy, human error, etc.

If you have ever seen a zombie or vampire movie, imagine that patient zero is the “master standard” and every zombie or vampire derived from that is a “little less perfect” than that singular top level unit.
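
To put rough numbers on that idea, here’s a little Python sketch. This is purely my own illustration (the uncertainty values are invented, and I’m assuming independent error sources combine root-sum-square, which is the usual simplification real uncertainty budgets start from):

    # Toy model of uncertainty growing down a calibration chain.
    # All numbers are made up for illustration; real labs build full
    # uncertainty budgets rather than anything this simple.
    import math

    def next_level(standard_u, comparison_u):
        # Root-sum-square combination of the standard's own uncertainty
        # with the uncertainty of the comparison process itself.
        return math.sqrt(standard_u**2 + comparison_u**2)

    u = 0.001             # hypothetical master-standard uncertainty, in cm
    comparison_u = 0.002  # hypothetical uncertainty added per comparison

    for level in ["master", "secondary", "working", "shop floor"]:
        print(f"{level:>10} standard: +/- {u:.4f} cm")
        u = next_level(u, comparison_u)

Each generation’s uncertainty can only grow, which is exactly why the unbroken chain back to the top matters.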

For usage as calibration standards, there’s a guideline called the rule of 4: when calibrating something, the standard you compare it against should be at least 4 times as accurate as the unit under test.

For example, if you are calibrating a ruler that is accurate to 0.1 cm, the standard you compare it against should be accurate to at least 0.025 cm.
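
That arithmetic in code form, if it helps (the helper name is mine, not anything standard):

    # The rule of 4 from above as a tiny helper function.
    def required_standard_accuracy(unit_under_test_accuracy, ratio=4):
        # Worst-case accuracy the reference standard needs in order
        # to calibrate a unit with the given accuracy.
        return unit_under_test_accuracy / ratio

    print(required_standard_accuracy(0.1))  # the 0.1 cm ruler -> 0.025 cm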

This helps retain that accuracy down the line for long periods of time.
