eli5 How do precision tool manufacturers get their first calibrations


I worked a lot with torque wrenches in the Army, and I always wondered how the “first” torque wrench was calibrated without another one to verify that it was accurate. Was there another tool to verify the calibration was correct, and if so, how did that one get calibrated? In my head I keep asking, “Well, how did the next one get calibrated?” every time I think about the first precision tools of any type, not just for torque.


8 Answers

Anonymous 0 Comments

There are indeed “levels” of calibration, from the tool you’re handling, through its manufacturer, probably through a contractor that does their calibration, and so on, until you follow the trail far enough and end up in a physics lab where someone is testing a measurement device against the standard definition of a unit.

It used to be particularly fun when some of those “definitions” were literal physical objects, like the standard meter and kilogram. There was a whole routine where the various physical standards around the world (there were multiple copies distributed around) were measured and compared for changes. And changes DID occur, because a physical object is a physical object, but that was tough luck, because the unit was whatever the object was. So whatever the change was, everything was recalibrated to say “this is a kilogram/meter now”.

That’s why all the SI units have since been redefined in terms of basic physical constants. But those definitions can still only be realized to whatever precision is currently available, and so the realizations are constantly refined.

Anonymous 0 Comments

We created the units we use, and all of them are arbitrarily defined. As long as you have a definition that can be reproduced repeatably, you can use it to calibrate other devices. So what matters is not what the definition is, but that everyone uses the same one.

Torque is a rotational force applied at a distance, measured in newton-meters or foot-pounds. So you need to be able to measure the distance from the center of rotation and the force you apply there.

The distance part is the more obvious one to measure.

Force units have historically been based on the weight of a specific mass at the surface of the Earth. Because the Earth’s gravity is not exactly the same everywhere, measuring force this way to high accuracy is a bit tricky, but the total variation across the surface of the Earth is only about 0.7%.

So a simple way to calibrate a torque wrench to some degree of accuracy is to mount it horizontally and suspend a known weight at a known distance from the rotational center. The torque is then easy to calculate. You can change the distance and/or the weight to produce the correct torque and use it to calibrate the measurement part of the wrench.
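As a rough sketch of that arithmetic, with made-up numbers for the weight and the arm length:

```python
# Dead-weight torque calibration sketch (assumed example values):
# a known mass hung at a known distance from the wrench's rotation center.

g_local = 9.80665   # m/s^2, standard gravity; the true local value
                    # varies by about 0.7% across the Earth's surface

mass_kg = 5.0       # calibration weight (assumed)
arm_m = 0.40        # distance from the rotation center (assumed)

force_n = mass_kg * g_local    # weight of the mass
torque_nm = force_n * arm_m    # torque = force x distance

print(f"{torque_nm:.2f} N·m")  # ~19.61 N·m to compare against the wrench's reading
```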

My guess is that a calibration like that can, without much trouble, be accurate to within a few percent, which is about the same as most torque wrenches.

Anonymous 0 Comments

Depends on what it measures, but at the end of the chain we have unit definitions.

We determine the length of a meter from time and the speed of light, and the length of a second from the oscillations of a specific atom (caesium). Force is based on mass (from the kilogram) and acceleration.

We create flatness by grinding three plates against each other in pairs.

Geometric shapes are formed by geometric definitions, like flat plates through mutual grinding, while units are “formed” from principles of physics.

Anonymous 0 Comments

A calibration is made against a “standard” – a standard is an item designed and verified to be accurate by comparison to a higher standard, and so on up to a primary standard.

Historically, primary standards were physical objects. For example, the length of 1 metre used to be the length of a stick kept at a Paris laboratory. Periodically, the top labs in each country would send their sticks to Paris to be checked against the primary.

These days, primary standards tend to be scientific experiments. For example, a metre is now defined as the distance travelled by light in a specific period of time. A laboratory can run the experiment, timing a beam of light against an atomic clock, and that gives the distance. If the experiment is done properly, the length will be exact to within the accuracy of the experiment.

This experiment can be difficult to do, so only a few laboratories in each country will have the equipment to do it. They will then have sticks which act as secondary standards, which can be used for calibrating other sticks, and so on.

Anonymous 0 Comments

WRT torque: torque is defined as a force applied at a given distance from a rotation point.

So 20 ft-lbs is equivalent to hanging a 20 lb weight 1 ft away from the rotation point, assuming the bar making the connection is massless. In practice you need a way to account for the mass of the metal bar that connects the weight to the rotation point.
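A small sketch of that correction, with assumed numbers, treating the bar as uniform so its own weight acts at its midpoint:

```python
# Torque from a weight hung on a horizontal bar, including the bar's
# own weight (assumed numbers; a uniform bar's weight acts at arm/2).

def applied_torque_ft_lb(weight_lb: float, arm_ft: float, bar_weight_lb: float) -> float:
    # hung weight acts at the full arm length;
    # the uniform bar's weight acts at half the arm length
    return weight_lb * arm_ft + bar_weight_lb * (arm_ft / 2)

# 20 lb weight on a 1 ft bar that itself weighs 0.5 lb:
print(applied_torque_ft_lb(20.0, 1.0, 0.5))  # 20.25 ft-lbs, not exactly 20
```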

Anonymous 0 Comments

If you go back far enough, you just find that the units are arbitrary and were invented by a person or committee. Originally those “master” measurements were kept in the most precise way they knew how, such as with a platinum weight for the kilogram, or a rod of a particular length for the meter. These measurement standards have since been replaced by observable physics; for instance, the meter is now defined as the distance light travels in 1/299,792,458 seconds. This is easy to replicate with the right precision instruments, but it is still arbitrary. If the original meter had been defined as twice as long, the modern definition would be the distance light travels in 2/299,792,458 seconds.
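To make that concrete, here is the arithmetic (nothing more than multiplying the speed of light by the time, done with exact fractions to avoid float rounding):

```python
from fractions import Fraction

c = 299_792_458  # speed of light in m/s (exact by definition)

print(c * Fraction(1, 299_792_458))  # 1 -- exactly one metre
print(c * Fraction(2, 299_792_458))  # 2 -- the "twice as long" metre
```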

Anonymous 0 Comments

For a fascinating look into the history of precision, try The Perfectionists by Simon Winchester.

Anonymous 0 Comments

Because a split second before the torque wrench was applied to the faucet handle, it had been calibrated by top members of the state and federal Departments of Weights and Measures, to be dead-on balls accurate. Here’s the certificate of validation!