Explain how a modern computer works at the most basic level.


How do you go from a silicon wafer with metallic lines printed on it to having a GUI desktop? I don’t understand the intersection between hardware and software at the fundamental level.



Anonymous

Protocols.

I’m assuming you know what a logic gate is, and that at the silicon level a computer can take 01010101010110101s and turn them into other 0101010101000101s.
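
If you want to see the tiniest possible example of that “turn 0s and 1s into other 0s and 1s” step, here’s a toy sketch in Python (just simulating gates in software, obviously not real silicon): a half-adder, the textbook gate circuit a CPU uses to add two single bits.

```python
# Toy model of logic gates turning bits into other bits.
# The gates are simulated in software here; on a chip they are transistors.

def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two single bits: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chain enough of those together and you can add whole numbers; chain enough of *that* together and you have a CPU.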

Most of the rest is protocols. We arbitrarily but globally agreed on what sort of electrical blips mean “hello, I am a USB device” or “hello, I am specifically a USB keyboard” or “hello, the user just pressed the letter K on the keyboard”.
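
To make the keyboard example concrete, here’s a stripped-down sketch of that agreement: a common USB keyboard message (a “boot protocol” HID report) is 8 bytes, and the letter K has the agreed-upon code 0x0E. Everything below is simplified for illustration; real drivers also handle key releases, rollover, error cases and a lot more.

```python
# Decoding one "the user just pressed K" message: an 8-byte USB HID
# boot-keyboard report laid out as [modifiers, reserved, key1..key6].
# This is only the happy path, shown for illustration.

HID_TO_CHAR = {0x04 + i: chr(ord('a') + i) for i in range(26)}  # 0x04='a' ... 0x1D='z'

def decode_boot_report(report: bytes) -> list[str]:
    modifiers, _reserved, *keys = report
    shift = bool(modifiers & 0b0010_0010)          # left or right Shift bit set
    chars = [HID_TO_CHAR[k] for k in keys if k in HID_TO_CHAR]
    return [c.upper() if shift else c for c in chars]

# The keyboard sends this when you press the letter K (usage ID 0x0E):
print(decode_boot_report(bytes([0x00, 0x00, 0x0E, 0, 0, 0, 0, 0])))  # ['k']
```

The keyboard doesn’t know or care what your word processor does with that “k”. It just promises to send bytes shaped like this, and the computer promises to understand them.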

But also “hello, I am a video interface. Tell me what sort of pixels you’re gonna send.” The computer then says, “OK, listen: the next blips I’m gonna send you are horizontal scanlines of 1024 RGB values each, 768 scanlines per frame, and that will repeat until I say otherwise.” Then the interface says, “OK, I’m ready, let’s go.” The monitor then has a protocol that listens to those blips and turns them into current for the display.
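
And here’s a toy version of that scanline agreement. The function names are made up for the example, and a real video link (VGA, HDMI, DisplayPort) also sends sync and blanking signals and a pixel clock, but the core idea is just this: both sides agree on the shape of the data before any of it is sent.

```python
# Toy sketch of "1024 RGB values per scanline, 768 scanlines per frame".
# WIDTH, HEIGHT, make_frame and send_frame are invented for this example.

WIDTH, HEIGHT = 1024, 768

def make_frame():
    """Build one frame: HEIGHT rows of WIDTH (r, g, b) pixels (a simple gradient)."""
    return [[(x * 255 // WIDTH, y * 255 // HEIGHT, 128) for x in range(WIDTH)]
            for y in range(HEIGHT)]

def send_frame(frame, send_scanline):
    # Both sides agreed on the shape in advance, so the monitor just counts:
    # after 1024 pixels it starts a new row, after 768 rows a new frame.
    for row in frame:
        send_scanline(row)

received = []
send_frame(make_frame(), received.append)   # the "monitor" just collects rows here
print(len(received), "scanlines of", len(received[0]), "pixels each")
```

Notice nothing here is clever. It works only because both ends agreed on the same counting rules beforehand. That agreement is the protocol.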

It’s all protocols. Basically, a computer works exactly as you would imagine: an unfathomable number of teeny tiny instructions on how to turn current on and off, and an unfathomable number of shared protocols that govern how input and output devices translate those instructions into things that visually, acoustically or haptically make sense to us.

It is very much like human language. “How can humans get anything done just by flapping their mouths and tongues?” It’s because we agreed that certain meat-flapping means words and sentences and grammar, allowing us to exchange surprisingly complex ideas. Computers have words and sentences that we have taught them, which they use to talk to their components and to us.
