How did we make more and more precise measuring equipment?


Young machinist here. How did we make precise instruments from something supposedly less precise, and how did we calibrate them? We have machines that can machine things to within .0001 inches or less of tolerance. Wouldn’t that require the machine itself to be made with at least the same precision? The same question applies to measuring equipment: for a micrometer or CMM to measure down to .0001 or even .00001 of an inch, how did we ensure the first ones were accurate?


5 Answers

Anonymous 0 Comments

There are processes that will get you something more precise than the tools you started with, and one example sticks out in my mind.

I’m taking this from a video that I’ll link at the bottom.

You’re in the wilderness and don’t have any tools, but there are lots of rocks around. You find three of them that have approximately flat faces. They can still be bumpy; they just need to be flat enough to rub against each other.

You start by rubbing one pair of them together, which makes both surfaces a little flatter. Then switch to a different pair, then the last pair, and repeat: rub A against B, then B against C, then A against C, over and over.

If you only used two rocks, you would eventually get two surfaces that fit together snugly but aren’t flat: bumps in one would simply be matched by valleys in the other. But with three rocks, a flat surface is the only shape that lets every rock fit snugly against both of the others. If one face ended up convex, both of its partners would have to be concave to fit it, yet those two concave faces could never then fit each other, so only flat satisfies all three pairings.
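You can see the two-rock failure mode in a toy simulation (my own simplified 1D model, not from the video): each plate is an array of surface heights, and "rubbing" wears material off wherever the flipped pair makes contact. Two plates end up mating almost perfectly while each one individually stays bumpy, which is exactly the lock-in that the third rock breaks.

```python
import numpy as np

# Toy 1D model of lapping two plates together (illustrative assumption,
# not a physical wear law): heights are sampled at N points, and each
# "rub" removes a little material where the flipped pair touches.
rng = np.random.default_rng(0)
N = 64        # sample points across each face
WEAR = 0.01   # height removed at each contact spot per rub

def rub(a, b):
    """Press plate b face-down onto plate a and wear the contact points.

    With b flipped over, position i of a touches position N-1-i of b.
    Material comes off wherever the combined height is near the maximum.
    """
    b_rev = b[::-1]
    combined = a + b_rev
    contact = combined >= combined.max() - WEAR
    a = a.copy()
    b_rev = b_rev.copy()
    a[contact] -= WEAR / 2
    b_rev[contact] -= WEAR / 2
    return a, b_rev[::-1]

# Two random bumpy plates, rubbed together many times.
a = rng.uniform(0.0, 1.0, N)
b = rng.uniform(0.0, 1.0, N)
for _ in range(5000):
    a, b = rub(a, b)

fit = np.ptp(a + b[::-1])   # how snugly the pair mates (0 = perfect)
flatness = np.ptp(a)        # peak-to-valley of plate a on its own
print(f"mutual fit gap: {fit:.3f}, plate-a bumpiness: {flatness:.3f}")
```

The pair’s mutual gap collapses to nearly zero while plate a keeps a large peak-to-valley error, i.e. the two surfaces lock into complementary shapes. Cycling a third plate through the rubbing (with the sliding and rotating you’d do in practice) breaks that lock-in, because each plate must then fit two different partners at once.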

So after much rubbing, you’ll be left with three genuinely flat surfaces. You’ve probably got a calibrated one of these in your shop: a surface plate. Having a known flat surface lets you make more complicated stuff. And you didn’t even need a tool with a rated precision!
