I've been thinking about this for a while: the instruments we use to produce and measure things get more precise as technology progresses. For example, modern machine tools let us produce parts with a tolerance of 1 micron or less. But that machine tool was made with something, and that something was made with something too. And since the tech level was lower back then, you'd have had to make a more precise machine tool using a less precise one. How is that possible?
If you go look at the hinges on your kitchen cupboards, you'll find that they have a few screws that let you adjust the hinge up and down, in and out. The holes drilled to mount the hinges can be fairly sloppy, because those adjustments let you line the doors up afterwards. That sort of adjustment exists in most machines, and it lets you fine-tune a machine to a better tolerance than it was produced to. That's crucially important: the individual pieces are produced to a lower standard than what you achieve at the business end of the machine!
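A toy simulation can show why a sloppy fabrication step plus a fine adjustment ends up more precise than the fabrication step alone. The numbers here (1 mm of drilling slop, a 0.1 mm adjustment screw) are made-up illustrations, not figures from the answer:

```python
import random

random.seed(0)

# Illustrative, assumed numbers: the mounting holes land anywhere within
# +/- 1.0 mm, but the hinge's adjustment screw moves in 0.1 mm steps.
COARSE_SLOP = 1.0   # mm of error in the drilled hole position
ADJUST_STEP = 0.1   # mm per step of the adjustment screw

def hang_door():
    # The hole lands somewhere within the coarse tolerance...
    hole_error = random.uniform(-COARSE_SLOP, COARSE_SLOP)
    # ...then we turn the screw to the nearest step that cancels it.
    correction = round(-hole_error / ADJUST_STEP) * ADJUST_STEP
    return abs(hole_error + correction)

errors = [hang_door() for _ in range(10_000)]
# After adjustment, the residual error is bounded by about half an
# adjustment step (~0.05 mm), far tighter than the 1.0 mm drilling slop.
print(max(errors))
```

The point of the sketch: the final precision is set by the resolution of the adjustment, not by the sloppiness of the step it corrects.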
Whatever adjustments you make, though, you're still left with some amount of imprecision. Where is that imprecision coming from? Perhaps there's too much vibration somewhere and you need to damp it with a spring, or add bracing so the machine becomes more rigid. Perhaps the problem comes from the way you're operating the machine, and you can change how you use it. Or maybe you just identify which components have the worst tolerances, and then fabricate however many you need until you get a "perfect" one.
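That last trick, "make many, keep the best," can also be sketched as a toy simulation. The process spread and the spec below are assumed numbers purely for illustration:

```python
import random

random.seed(1)

# Assumed, hypothetical spec: the machine needs a component true to
# within 2 microns, but the process only holds +/- 10 microns per part.
PROCESS_SPREAD = 10.0   # microns of part-to-part variation
TARGET = 2.0            # microns, the spec we actually need

def make_part():
    # Each fabricated part's error lands somewhere in the process spread.
    return abs(random.uniform(-PROCESS_SPREAD, PROCESS_SPREAD))

# Fabricate a whole batch, measure every part, keep the single best one.
batch = [make_part() for _ in range(50)]
best = min(batch)
print(f"worst in batch: {max(batch):.2f} um, best kept: {best:.2f} um")
```

With a 20% chance per part of landing inside spec, a batch of 50 almost always contains at least one part good enough, even though the process itself can't reliably hit the target. The catch, of course, is that you need a measuring instrument finer than the parts you're selecting.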
Sometimes, improvements come from completely different schools of engineering. Car engines benefitted massively from electronics becoming cheap enough for electronic fuel injection. Sometimes pure science gives you a breakthrough. All sorts of instrumentation became much more precise when lasers became available.