The exact answer is different for each of those things, but it seems like the basic question is how we managed to get more processing power onto smaller devices.
The simple answer is that we found ways to fit more transistors (tiny switches that tell electricity what to do) onto computer chips. Some of that came from finding new materials to make the transistors from, and some of it came from using lasers and UV light to etch features far too small for any hand or mechanical tool to place manually.
There’s a famous law in computing called Moore’s Law, which (paraphrased) states that the number of transistors you can fit on a chip doubles roughly every two years. Along the way it got restated as “processing power doubles every 18 months,” because the number of transistors isn’t the only thing being improved, nor is it necessarily the most important, and at this point it’s more of an aspirational goal that chip makers aim for than a law or prediction.
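To make that doubling concrete, here’s a tiny sketch of the math behind “doubles every two years.” The starting count and time span are made-up illustrative numbers, not real chip data:

```python
def projected_transistors(start_count: int, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward using N(t) = N0 * 2^(t / T)."""
    return start_count * 2 ** (years / doubling_period)

# Hypothetical example: start at 1 million transistors.
# 20 years of doubling every 2 years is 10 doublings, i.e. a 2^10 = 1024x increase.
print(projected_transistors(1_000_000, 20))  # 1024000000.0 -> roughly 1 billion
```

That exponential growth is why a couple of decades of steady doubling turns a million transistors into a billion.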
In short, we used new materials like germanium and silicon, along with new techniques like using light to “burn” patterns onto the chips instead of physically placing the parts there, which let us fit more and more stuff into the same space.