Protocols.
I’m assuming you know what a logic gate is and that, at the silicon level, a computer can take 01010101010110101s and turn them into other 0101010101000101s.
Most of the rest is protocols. We arbitrarily, but globally, agreed on what sorts of electrical blips mean “hello, I am a USB device”, or “hello, I am specifically a USB keyboard”, or “hello, the user just pressed the letter K on the keyboard”.
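If it helps to see the idea as code, here’s a toy sketch. The message codes are completely made up (real USB is far more involved); the point is just that both sides agreed on what the blips mean *before* they ever talked:

```python
# Made-up message codes both sides agreed on in advance (NOT real USB):
HELLO_I_AM_USB = 0x01    # "hello, I am a USB device"
I_AM_A_KEYBOARD = 0x06   # "hello, I am specifically a keyboard"
KEY_PRESSED = 0x2A       # "the user just pressed a key"

def device_messages():
    """The blips a toy keyboard might send when plugged in and used."""
    yield bytes([HELLO_I_AM_USB])
    yield bytes([I_AM_A_KEYBOARD])
    yield bytes([KEY_PRESSED, ord("K")])  # "the user pressed the letter K"

def host_listen(messages):
    """The computer side: it only understands these blips because it
    agreed on the same codes beforehand."""
    for msg in messages:
        if msg[0] == HELLO_I_AM_USB:
            print("host: a USB device just said hello")
        elif msg[0] == I_AM_A_KEYBOARD:
            print("host: it's a keyboard, route its blips to the input system")
        elif msg[0] == KEY_PRESSED:
            print(f"host: the user pressed {chr(msg[1])!r}")

host_listen(device_messages())
```

Strip away the agreed-upon codes and the exact same blips are meaningless noise. That’s the whole trick.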
But also “hello, I am a video interface, tell me what sort of pixels you’re gonna send”. The computer then says “ok listen, the next blips Imma send you are scanlines: 1024 RGB values per horizontal line, 768 lines per frame, and that will repeat until I say otherwise”. The interface says “ok, I’m ready, let’s go”. The monitor then has a protocol that listens to those blips and turns them into current for the display.
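Same idea in toy form. This isn’t any real video standard, just a sketch of “we agreed on a 1024×768 format, now I stream RGB values in that agreed order and you know where each one goes”:

```python
# The "ok listen" part of the handshake: a made-up agreed format.
WIDTH, HEIGHT = 1024, 768

def frame():
    """One full frame: HEIGHT scanlines, each WIDTH (red, green, blue) values."""
    for y in range(HEIGHT):
        yield [(x % 256, y % 256, 128) for x in range(WIDTH)]

def display(scanlines):
    """The monitor side: it knows each chunk of WIDTH values is one
    horizontal line of the picture, because the protocol said so."""
    count = 0
    for line in scanlines:
        assert len(line) == WIDTH  # the protocol promised exactly this many
        count += 1                 # a real monitor would turn each RGB value
                                   # into current for one pixel right here
    print(f"drew a frame of {count} scanlines, {WIDTH} pixels each")

display(frame())
```

Note that the raw values carry no position information at all. The monitor only knows pixel 5000 belongs on the fifth scanline because both sides agreed on the format up front.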
It’s all protocols. Basically, a computer works exactly as you would imagine. It really is an unfathomable number of teeny tiny instructions on how to turn current on and off, and an unfathomable number of unified protocols that govern how input and output devices translate those instructions into things that visually, acoustically or haptically make sense to us.
It is very much like human language. “How can humans get anything done just by flapping their mouths and tongues?” It’s because we agreed that certain meat flapping means words and sentences and grammar, allowing us to exchange surprisingly complex ideas. Computers have words and sentences that we have taught them, which they can use to talk to their components and to us.