Young machinist here. How did we make a precise instrument from something supposedly less precise, and how did we calibrate it? We have machines that can machine things to within .0001 inches or less of tolerance. Wouldn't that require the machine itself to be made with at least the same precision? The same question applies to measuring equipment: for a micrometer or CMM to measure down to .0001 or even .00001 of an inch, how did we ensure the first ones were accurate?
Forty threads per inch, 40TPI.
So if you spin the threaded spindle 4 times, it advances 1/10th of an inch. Then you put 25 evenly spaced marks around the thimble. Four turns times 25 marks = 100, so each hash mark is 0.001″.
One full spin is 0.025″, right? So four spins is 0.100″, and 40 spins is 1.000″.
This is why a micrometer uses 40TPI…
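Here is a minimal sketch of that arithmetic in Python, assuming the 40 TPI spindle and 25-division thimble described above (the function name and variable names are just for illustration):

```python
# Inch-micrometer arithmetic: a 40 TPI spindle advances 1/40 = 0.025" per
# full revolution, and a thimble with 25 divisions splits that into 0.001" steps.

THREADS_PER_INCH = 40
ADVANCE_PER_TURN = 1 / THREADS_PER_INCH                    # 0.025" per revolution
THIMBLE_DIVISIONS = 25
INCH_PER_DIVISION = ADVANCE_PER_TURN / THIMBLE_DIVISIONS   # 0.001" per hash mark


def micrometer_reading(full_turns: int, thimble_marks: int) -> float:
    """Reading in inches from whole spindle turns plus extra thimble divisions."""
    return full_turns * ADVANCE_PER_TURN + thimble_marks * INCH_PER_DIVISION


print(f'{micrometer_reading(4, 0):.3f}"')    # 0.100"  (four spins = 1/10th inch)
print(f'{micrometer_reading(40, 0):.3f}"')   # 1.000"  (forty spins = one inch)
print(f'{micrometer_reading(0, 1):.3f}"')    # 0.001"  (one hash mark)
```

The point is that the thousandth-of-an-inch resolution comes purely from counting turns and divisions of a known thread pitch, not from any scale that is itself marked to 0.001″.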