Explain how a modern computer works at the most basic level.

How do you go from a silicon wafer with some metallic lines printed on it to having a GUI desktop? I don’t understand the intersection between hardware and software at the fundamental level.

A lot of people gave more detailed and accurate answers that try to fully explain everything, but I’ll try to explain it at a more basic level.

Computers, at a very low level, act kind of like light switches. Flip the switch one way for “on”, flip it the other way for “off”. If you had 100 light bulbs, you could have 100 switches, one for each light bulb. But maybe you don’t want to deal with 100 individual light switches, so you start wiring them together. Maybe you set two switches up so that if either switch is flipped to “on”, the light bulb goes on. Maybe for another set, the light stays off if either switch is “off”, so it only goes on when both switches are on. You could then take those more complicated sets of switches and combine them to make something like a switch that turns off all of the lights in the kitchen and turns on all of the lights in the living room.
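To make that concrete: in hardware, those wired-together sets of switches are called logic gates, and OR/AND/NOT/XOR are the standard names for them. Here’s a rough sketch in Python (the code itself is just the analogy written out, not how hardware is actually built):

```python
# A "switch" is either on (True) or off (False).

def or_gate(a, b):
    # Bulb lights if either switch is on.
    return a or b

def and_gate(a, b):
    # Bulb lights only if both switches are on.
    return a and b

def not_gate(a):
    # Flips a switch: on becomes off, off becomes on.
    return not a

# Wiring simple sets together makes a more complicated rule. This one
# lights the bulb when exactly ONE switch is on (an XOR gate), which
# happens to be the core of adding two binary digits.
def xor_gate(a, b):
    return and_gate(or_gate(a, b), not_gate(and_gate(a, b)))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", xor_gate(a, b))
```

That XOR gate is exactly the kind of “more complicated set of switches” that gets combined into the adders mentioned below.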

This basically describes a computer. No matter what a computer actually does, at the lowest level it’s just a bunch of switches flipping in complicated ways. The ones and zeros everyone always talks about are the light bulbs being on or off. But the key to why anyone can actually program computers has to do with what people have done over a very long period of time to build up the building blocks that can be used. Some of the earliest computers had literal switches that had to be flipped in certain patterns to tell the computer what to do, but now when you look at a computer screen it’s very difficult to see how exactly everything is being translated to ones and zeros. That’s because there are so many layers of building blocks between what you see and what the physical hardware is actually doing.

Once a computer can do one thing, everyone can reuse that to do more complicated things. So the companies that build the hardware build in a few basic, simple actions. You can tell the computer to add two numbers, check if one number is higher than another, copy a number from one place to another, things like that. The hardware makers deal with making sure those actions get translated correctly to stuff that happens in the physical hardware. You could technically write a huge computer program with just those basic operations, directly telling the hardware what to do at all points, but that gets hard to manage pretty quickly.
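Here’s a rough sketch of what programming with only those basic actions looks like. The toy “machine”, its memory slots, and the action names are all invented for illustration, not any real CPU’s instruction set:

```python
# A toy "machine": memory is a row of numbered slots holding numbers,
# and the only available actions mirror the kind a real CPU provides.
# The names and layout here are invented for this example.
memory = [0] * 8

def add(dst, src):
    # Add the number in slot src into slot dst.
    memory[dst] += memory[src]

def copy(dst, src):
    # Copy the number in slot src into slot dst.
    memory[dst] = memory[src]

def is_greater(a, b):
    # Check whether the number in slot a is higher than the one in slot b.
    return memory[a] > memory[b]

# A tiny "program" written with only those actions: compute 2 + 3.
memory[0] = 2
memory[1] = 3
copy(2, 0)               # slot 2 = 2
add(2, 1)                # slot 2 = 2 + 3
print(memory[2])         # 5
print(is_greater(2, 0))  # True, because 5 > 2
```

Even adding two numbers takes several steps when you spell it out this way, which is why nobody wants to write big programs at this level.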

Instead, higher level building blocks are created using those lower level building blocks. Once someone figures out how to use the add, check, and copy actions to build a loop that does the same thing over and over until some condition is met, the next programmer doesn’t need to know about add/check/copy; they just need to know about the loop. Collections of those higher level building blocks are what programming languages are. And once someone uses a programming language to do something specific, like show text on a screen, that work can be re-used too. So now someone doesn’t need to know how to deal with the screen or how a font file works; they just know they have a function called “draw text” that they can use. These are things like programming libraries, APIs, hardware drivers, etc.
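For instance, here’s a rough sketch of the same loop written both ways. The low-level version uses only invented add/check/jump instructions run by a tiny made-up interpreter, but it mirrors how loops actually work on hardware:

```python
# High level: the programmer only needs to know about the loop.
total = 0
for i in range(1, 6):
    total += i
print(total)  # 15

# Low level: the same loop spelled out with only add/check/jump.
# Instruction names and format are invented for illustration.
program = [
    ("set",  "i", 1),        # 0: i = 1
    ("set",  "total", 0),    # 1: total = 0
    ("add",  "total", "i"),  # 2: total = total + i   <- loop body
    ("inc",  "i", 1),        # 3: i = i + 1
    ("jle",  "i", 5, 2),     # 4: if i <= 5, jump back to instruction 2
]
regs = {}
pc = 0  # program counter: which instruction runs next
while pc < len(program):
    op, *args = program[pc]
    if op == "set":
        regs[args[0]] = args[1]
    elif op == "add":
        regs[args[0]] += regs[args[1]]
    elif op == "inc":
        regs[args[0]] += args[1]
    elif op == "jle" and regs[args[0]] <= args[1]:
        pc = args[2]
        continue
    pc += 1
print(regs["total"])  # 15
```

Translating the top version into something like the bottom one is exactly what a compiler does, so the programmer never has to think about program counters and jumps.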

This process of making more and more complicated building blocks and reusing other people’s lower level code has been going on basically since computers have existed. So today you might start up a brand new app, but it could be using a programming language created 20 years ago, built on top of another programming language from 50 years ago, on an operating system that is still building off of things created 60 years ago. The key part is that no matter how complicated all of that gets at a high level, in the end it all gets translated into a very long series of simple actions that your computer can do very quickly.
