I’ll try and fill the gaps that I feel aren’t quite being addressed yet. We’ll use a basic microcontroller as an example, the kind you would find in consumer electronics.
A microcontroller is basically a huge number of tiny transistor switches crammed into a miniature circuit. We control these in an on/off manner, which in reality is defined by voltages: 2.0 to 3.3V might count as ‘on’, and 0 to 1.5V might count as ‘off’. The gap in the middle is too ambiguous, too close to either side, so we avoid it. To make this useful in a logical sense, we name high and low as 1 and 0 in the digital world. At its core, that’s really all there is to it.
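To put rough numbers on it, here’s a tiny sketch in C. The thresholds are just the made-up example numbers from above, not from any real chip’s datasheet:

```c
#include <stdio.h>

/* Purely illustrative thresholds -- a real chip's datasheet defines its own. */
#define LOW_MAX_VOLTS   1.5
#define HIGH_MIN_VOLTS  2.0

/* Turn a measured voltage into a logic level: 1, 0, or "undefined" in between. */
int logic_level(double volts) {
    if (volts >= HIGH_MIN_VOLTS) return 1;   /* counts as 'on'  */
    if (volts <= LOW_MAX_VOLTS)  return 0;   /* counts as 'off' */
    return -1;                               /* the ambiguous zone we avoid */
}

int main(void) {
    printf("%d\n", logic_level(3.1));  /* prints 1  */
    printf("%d\n", logic_level(0.2));  /* prints 0  */
    printf("%d\n", logic_level(1.8));  /* prints -1, in the no-man's-land */
    return 0;
}
```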
If you take even a rough look at digital logic, you’ll see how you can start arranging your 1’s and 0’s into basic logic gates, just by chaining switches together. If you’re a Minecraft person, redstone logic is a really interactive way of seeing this. It seems basic at first (what the hell do you do with an AND or OR gate?), but when you consider that a CPU such as an Intel i9 has around *10 billion* switches inside it, you can see how it starts to turn into something useful.
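To give a flavour of how these ‘useless’ gates become useful, here’s a rough sketch in C, pretending each little function is a gate. Chain an XOR and an AND together and you’ve already built a half adder, which adds two 1-bit numbers (the first baby step towards a calculator):

```c
#include <stdio.h>

/* Each "gate" is just a rule mapping input bits to an output bit. */
int AND(int a, int b)  { return a & b; }
int OR(int a, int b)   { return a | b; }
int NOT(int a)         { return !a; }
int XOR(int a, int b)  { return a ^ b; }

/* A half adder: adds two 1-bit numbers, producing a sum bit and a carry bit. */
void half_adder(int a, int b, int *sum, int *carry) {
    *sum   = XOR(a, b);
    *carry = AND(a, b);
}

int main(void) {
    int sum, carry;
    half_adder(1, 1, &sum, &carry);
    printf("1 + 1 = carry %d, sum %d (binary 10 = decimal 2)\n", carry, sum);
    return 0;
}
```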
Then we came up with ways to interpret all the 1’s and 0’s. Basically, we pick a set of ‘rules’ we all agree on, and then we can read the signals as information. Like Morse code – it’s all dots and dashes, but we made up a set of language rules to turn them into letters. This applies to how we send information over connections, how fast we send it, how we interpret numbers and characters, and basically everything else you can think of. Everything comes down to these high and low voltages, and what we’ve decided it all means.
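As a small illustration (this is just standard binary and ASCII, nothing specific to any one device), here’s the exact same pattern of eight 1’s and 0’s read under two different rule sets:

```c
#include <stdio.h>

int main(void) {
    /* The same eight switches: 01000001 in binary. */
    unsigned char bits = 0x41;

    /* Rule set 1: treat it as a plain number. */
    printf("As a number:    %d\n", bits);      /* 65  */

    /* Rule set 2: treat it as ASCII text. */
    printf("As a character: %c\n", bits);      /* 'A' */

    return 0;
}
```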
If I send info from my PS5 to my TV, we’re agreeing that we are using HDMI to do so. HDMI is a standard that tells us exactly how those two things should be connected, and exactly what those 1/0 signals represent. The PS5 has its own software to decide what to send, and the TV has its own software to decide what to display on its screen. The fact that we have a universally agreed-upon set of rules is the only reason that a stream of voltage signals means anything to either of them. If they aren’t on the same page, it’s suddenly meaningless. Just like if we use the same letters, but one of us thinks we’re speaking French while the other is expecting Italian. There are many different variations of ‘language’, both in what the data means and in the physical way devices are connected and controlled. The USB protocols are an incredibly common example too.
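Here’s a toy illustration of ‘not being on the same page’ (nothing to do with HDMI specifically, just the general idea): two devices receive the exact same two bytes but disagree about which end of the number comes first, so they end up with completely different values.

```c
#include <stdio.h>

int main(void) {
    /* Two bytes come down the wire. */
    unsigned char wire[2] = { 0x01, 0x02 };

    /* Device A's rule: the first byte is the "big" end (big-endian). */
    int as_big_endian    = (wire[0] << 8) | wire[1];   /* 258 */

    /* Device B's rule: the first byte is the "small" end (little-endian). */
    int as_little_endian = (wire[1] << 8) | wire[0];   /* 513 */

    printf("Same two bytes, two readings: %d vs %d\n",
           as_big_endian, as_little_endian);
    return 0;
}
```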
A microcontroller has memory, so it can store 1’s and 0’s that it needs to park temporarily (or long term) for use later. It also controls pins, which let us connect it to other things. According to its instructions, it can check whether something else has made a pin a 1 or a 0, or it can set that pin to a 1 or a 0 itself.
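Under the hood, ‘checking’ or ‘setting’ a pin usually just means reading or writing a single bit at a special memory address. This is only a sketch with made-up addresses; a real chip’s datasheet tells you the actual ones:

```c
#include <stdint.h>

/* Hypothetical register addresses -- placeholders for whatever a real chip's
   datasheet says. A "register" here is just a memory address wired to the pins. */
#define GPIO_INPUT_REG   (*(volatile uint32_t *)0x40000000)
#define GPIO_OUTPUT_REG  (*(volatile uint32_t *)0x40000004)

/* Check whether something outside has driven a pin high (1) or low (0). */
int read_pin(int pin) {
    return (GPIO_INPUT_REG >> pin) & 1;
}

/* Set one of our own pins high or low. */
void write_pin(int pin, int value) {
    if (value)
        GPIO_OUTPUT_REG |=  (1u << pin);   /* flip that one bit to 1 */
    else
        GPIO_OUTPUT_REG &= ~(1u << pin);   /* flip that one bit to 0 */
}
```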
The magic that ties it all together is the programming. We write out the behaviour of the device, and the decisions it makes, as a complex set of checks and changes to values. For example, we might say ‘check pin 1 -> if it’s high -> set pin 2 to high’. Pin 1 might be a button, pin 2 might be an LED. So now we have a program that turns on a light when you press a button. Write a very complex version of this sort of process, and that’s how we tie together all of the different inputs/outputs and the decisions made based on them.

When we write code on our computer, some very fancy compiler software turns it from the human-readable form that we’ve created into a set of instructions specific to that machine. This goes right back to basics and fundamentally controls all of those billions of switches in a way that actually matches our code.
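Here’s roughly what that button-and-LED program could look like as actual C code, before the compiler turns it into machine instructions. The register addresses and pin numbers are made up for illustration; every real chip documents its own:

```c
#include <stdint.h>

/* Made-up register addresses again -- stand-ins for the chip-specific details. */
#define GPIO_INPUT_REG   (*(volatile uint32_t *)0x40000000)
#define GPIO_OUTPUT_REG  (*(volatile uint32_t *)0x40000004)

#define BUTTON_PIN  1   /* pin 1: a button drives this high when pressed */
#define LED_PIN     2   /* pin 2: an LED lights up when we drive it high */

int main(void) {
    for (;;) {                                            /* run forever */
        int button = (GPIO_INPUT_REG >> BUTTON_PIN) & 1;  /* check pin 1 */

        if (button)                                       /* if it's high... */
            GPIO_OUTPUT_REG |=  (1u << LED_PIN);          /* ...set pin 2 high */
        else
            GPIO_OUTPUT_REG &= ~(1u << LED_PIN);          /* otherwise set it low */
    }
}
```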