Programming code tells the computer to do something, but what makes the code actually mean anything to the computer?



In: Engineering

36 Answers

Anonymous 0 Comments

In the end, the compiler takes the English-like computer code and turns it into assembly code. These are series of instructions that the central processing unit (CPU) understands in a way that lets it send out electrical signals to specific parts of the computer. A 1 means 5 V, a 0 means 0 V. To handle the logic there are things called flip-flops (yes, funny name). These are actual electrical circuits that handle the basics of logic, and they are simple enough that you can build one yourself with a few wires and switches.
The catch is that you need to make these extremely small, so that a CPU that fits in a small device can still be powerful enough.
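
If you want to see the flip-flop idea without any wires, here is a rough Python sketch of an SR latch (one of the simplest flip-flops), with its two cross-coupled NOR gates simulated as plain functions. This is only a toy model of the circuit, not real hardware, and the names are just ones I picked for the sketch.

```python
# Toy model of a NOR-based SR latch, the simplest kind of flip-flop.

def nor(a, b):
    """A NOR gate: output is 1 only when both inputs are 0."""
    return not (a or b)

def sr_latch(set_, reset, q=False, q_bar=True):
    """Feed the two NOR gates into each other until the outputs settle."""
    for _ in range(4):  # a few passes is enough for this toy model to stabilize
        q_new = nor(reset, q_bar)
        q_bar_new = nor(set_, q)
        q, q_bar = q_new, q_bar_new
    return q

# Pulse "set": the latch stores a 1 and keeps it after set goes back to 0.
state = sr_latch(set_=True, reset=False)                              # True
state = sr_latch(set_=False, reset=False, q=state, q_bar=not state)   # still True
print(state)
```

The point is the feedback: each gate's output feeds the other's input, which is what lets the circuit "stay put" and remember a single bit.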

Anonymous 0 Comments

Read this all in Jeff Goldblum’s voice, because I’m watching his stupid Disney documentary, so that’s how I wrote all of this. Of these answers, I think it will be the most fun to read while still being accurate and not overwhelming. If English is not your first language, let me know and I can post it again more simply, and still fun.

Good lord, there seems to be a bunch of computer scientists in here. These answers are generally all right, but I don’t think they get down far enough to the physical level of what you’re wondering about. You’re asking for bottom-up but you’re getting top-down. Electrical guy here. I’m not a very good electrical engineer, but I wouldn’t be worth dogshit if I couldn’t explain how we get from atoms to YouTube videos. My electronics professor loved to say “digital is just a special case of analog”. So with that in mind:

The basic building block of all digital circuits is the **transistor**. If you hear about “tubes” with old electronics, those vacuum tubes were the transistor’s ancestors: giant glass tubes with the air sucked out that did the same switching job. Today we have **MOSFETs**: Metal-Oxide-Semiconductor Field-Effect Transistors. You don’t need to know what that means, it’s just fun to know, like knowing what DNA stands for.

MOSFETs are tiny, tiny, tiny microscopic pieces of silicon and other stuff. Very, very complex. Decades of engineering and math and physics and chemistry and quantum mechanics and manufacturing and ingenuity have all led up to them. It has to do with material doped to have too many electrons, and material doped to have not enough electrons, and then building a sort of, I guess, cake out of them. Layers and pieces constructed just right to do what you want at a microscopic level, so electrons move as you guide them.

The entire modern world is made of these, built on clever tricks by an Egyptian and a Korean-American scientist at AT&T’s Bell Labs in 1959. Not that long ago, really: Hayao Miyazaki and Bernie Sanders would’ve been 18 around then. What’s nice is that these guys (the scientists, not Miyazaki) had never worked with this stuff before but quickly gained a knack and somehow solved problems other scientists could only dream of. Never underestimate a fresh pair of eyes. Look up Heaviside or Ramanujan when you get the chance.

A transistor is a switch, but instead of needing to press or flip it physically, you apply a voltage, which is like water pressure for electricity. You use electricity to control electricity. Think of it like a stoplight. It’s actually almost exactly like a stoplight. You make the light red (low voltage), traffic stops. That’s a “0” (the MOSFET is in cutoff). You make the light green (high voltage) and traffic goes. That’s a “1” (the MOSFET is fully switched on). A yellow light puts the MOSFET in its in-between region, which is great for amplifiers but not for computers, so we’ll skip it. Pretend they’re the cars from the Bee Movie, they stop and start perfectly. It’s a metaphor, whatever.

Now it turns out you can hook up a few transistors together and make some neat logic happen, what we call Boolean logic. Like, okay, traffic can only go if this light is green AND that light is green. We’re not thinking in voltages and currents anymore; we’re thinking in 1’s and 0’s already.
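
To make that concrete, here is a little Python sketch (my own toy model, not real electronics) that treats a transistor as nothing more than a switch that passes a signal while its gate is driven high, then puts two of them in series for AND and in parallel for OR.

```python
# Toy model: a transistor as a voltage-controlled switch.

def transistor(gate, current_in):
    """Pass the incoming signal only while the gate is driven high (1)."""
    return current_in if gate else 0

def and_gate(a, b):
    # Two switches in series: current gets through only if BOTH are on.
    return transistor(b, transistor(a, 1))

def or_gate(a, b):
    # Two switches in parallel: current gets through if EITHER is on.
    return max(transistor(a, 1), transistor(b, 1))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_gate(a, b), or_gate(a, b))
```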

Let’s go further, more abstract. If you get a 0 here, a 1 there, we want this to happen. Or a 0 here and then a 1 at the same light and then another 1 and another 1 but then a 0, we want this specific thing to happen.

You can use this idea to build simple blocks, like addition and subtraction. Soon multiplication, or shifting bits around. You put all this functionality into something called an **ALU, an Arithmetic Logic Unit**. You have to store the numbers for it to use somewhere, so you build memory, which is transistors feeding their outputs back into themselves so the values stay put. Then you devise a system: okay, if this switch is 1 and this one is 0, that means add the numbers. Which numbers? Well, check these other switches: if they read 111, pull it from here; if 110, pull it from there. That is called **machine language**.
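
Here is that idea as a toy example in Python. The instruction format is completely made up for the sketch (no real CPU encodes things this way): a couple of bits say which operation to do, and the rest say which storage slots to use.

```python
# A toy machine word in a made-up format:
# bits 7-6 pick the operation, bits 5-3 and 2-0 pick two storage slots.

registers = [0, 7, 42, 5, 0, 0, 0, 0]   # eight little storage slots

def execute(instruction):
    opcode = (instruction >> 6) & 0b11   # which operation?
    a      = (instruction >> 3) & 0b111  # which register to read and write back
    b      =  instruction       & 0b111  # which register to read
    if opcode == 0b00:                   # 00 means "add"
        registers[a] = registers[a] + registers[b]
    elif opcode == 0b01:                 # 01 means "subtract"
        registers[a] = registers[a] - registers[b]

execute(0b00_010_001)   # "add register 1 into register 2"
print(registers[2])     # 49
```

Real instruction sets have many more operations and much more elaborate encodings, but the pattern of "some bits pick the operation, other bits pick the data" is the same.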

We don’t want to keep saying 1’s and 0’s. What a pain. What if we could assign a mnemonic to each operation? Shorthand for what you want done, what you want it done to, and where to put the result afterward? That is called **assembly**.

Soon assembly also became a pain. We’re still dealing with circuits and transistors without saying we are. We’re fooling ourselves. We’re babbling. Instead of instructing word by word, let’s make sentences. Why not? Or hell, pictures. We’ll make a dictionary for a language we speak (English, of course, because much of this is built on work by Americans and the British, sorry), and a manual that says how to use the dictionary. That dictionary is a **programming language**, and that manual is a **compiler**. You build an **operating system** that handles stuff like talking to the CPU directly and moving memory around, so you can focus on higher-level apps like a video game.

And now we are where you are: writing programs in a language using words from that dictionary. But in the end you’re applying voltages to open and close conductive channels in silicon patterned by photolithography. The words you type make trillions of electrons move *exactly* the way you specified, at close to the speed of light, without you ever needing to think about them, so someone can send a funny joke to their friend.

Anonymous 0 Comments

OK let’s try philosophy for a 5yo. 😀

I think to really understand what things “mean”, you need to know what “meaning” (in your question) means for your computer.

Of course there are bits and bytes and CPUs and machine code and programming languages and Fortnite and so on, but that still does not really tell you what things “mean”.

*** NOTE: There are some differences between how a “real” computer works and what I describe here. I will lay these out at the end. I have good reasons! 🙂 ***

I think the missing piece in what others pointed out (correctly) is: what can computers actually DO? Because this is what the “code” is; it’s instructions to DO something. And this is the only “meaning” things have for a computer: the code tells it to execute operations. What those operations are, what they lead to – well, the computer does not know, of course.

But what are those commands? What does a computer do? Even the most advanced hardcore gaming PC is, in the end, a very simple machine when it comes to doing things. Let’s make a list of what it can do:

1. it can store natural numbers. Simply put, you can have a place in the computer that has a name and that you can put numbers in. Let’s call those places variables; they have names: a, b, c, d, …

2. you can set those variables to a number you like: a = 1337

3. you can add a constant value to, or subtract one from, a variable: a = a - 10

4. A loop: you can repeat a set of commands. The loop stops when a variable you specify is 0.

This. is. it. Not too bad, is it?

And your code tells the computer to do one of those things, one after another. That is the meaning of code.
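
To see those commands in action, here is a + b written in Python, but deliberately using nothing beyond the rules above: set a variable to a constant, add or subtract a constant, and loop until a chosen variable hits 0. (It happily destroys a and b along the way, which is allowed.)

```python
# a + b using only: set a variable, add/subtract a constant, loop until 0.

a = 3
b = 4
result = 0

while a != 0:            # the loop stops when a reaches 0
    a = a - 1
    result = result + 1

while b != 0:            # same trick again for b
    b = b - 1
    result = result + 1

print(result)            # 7 -- addition built from nothing but counting
```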

Now there are two more immediate follow-up questions, and I’m afraid we need to grow up a bit now (I’m not clever enough to describe this any more simply).

1. I don’t believe you. If this is all, why the heck can we play Fortnite in HDR?

The pixels you see on your screen are represented by variables. So basically, if your screen starts out black, to set one pixel to white you need to figure out which variable controls it and set it to the number that represents white. The computer has no idea what it’s doing and has no concept of white, so this is the only meaning it has to understand. The same is true for every function of your computer that does not seem to have anything to do with numbers: for the computer, everything is just a number. We provide the meaning with output devices and by experiencing the output.
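
As a toy illustration (the sizes and the color numbers here are made up; real graphics hardware is far more involved): a tiny “screen” that is nothing but a list of numbers, where we pretend 0 means black and 255 means white.

```python
# Toy framebuffer: a 4x4 black screen stored as nothing but numbers.
# The computer only sees the numbers; the display and our eyes supply the meaning.

WIDTH, HEIGHT = 4, 4
screen = [0] * (WIDTH * HEIGHT)

def set_pixel(x, y, value):
    screen[y * WIDTH + x] = value   # "which variable" controls that pixel

set_pixel(2, 1, 255)                # turn one pixel white
for row in range(HEIGHT):
    print(screen[row * WIDTH:(row + 1) * WIDTH])
```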

2. This still sounds wrong. All computer languages I know have waaaaaaay more commands than three.

True. This is where what others here said comes in. Building Fortnite with only those three commands would be tedious, maybe even impossible in practice, because we would have a very hard time reading and understanding what is so easy for the computer to run.

So we build layers of complexity on top of this:

The CPU (the thing that runs your computer) itself already has a ton of helper functions. For example, the three commands do not allow you to add one variable to another; you need a small program with multiple commands! How tedious. So the microcode of the CPU includes this program, and you can simply say c = a + b. (Not completely true, see far below, but it illustrates microcode.) There are a lot of these. Some are built physically with wires and transistors, some are actual little programs within the CPU; the difference does not really matter, because they still do not create more “meaning” for the computer, and all it does in the end is execute commands 2-4.

The CPU commands have numbers as names (as they address the variables that contain the little programs!), which is super tedious. So we gave them names and wrote a program (an assembler) that converts names humans can remember into those numbers. So instead of writing 16 15 12 (add (command 16) the number in variable 12 to the number in variable 15), you write add b,a and the assembler translates it.
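
That translation step is simple enough to sketch in a few lines of Python. The opcode and variable numbers below are just the made-up ones from the example in this paragraph, not any real CPU’s.

```python
# A tiny assembler in the spirit of the example above.

OPCODES   = {"add": 16}        # made-up opcode number from the example
VARIABLES = {"a": 12, "b": 15} # made-up variable numbers from the example

def assemble(line):
    """Turn 'add b,a' into the machine numbers '16 15 12'."""
    mnemonic, operands = line.split(maxsplit=1)
    dest, src = [VARIABLES[name.strip()] for name in operands.split(",")]
    return f"{OPCODES[mnemonic]} {dest} {src}"

print(assemble("add b,a"))     # -> "16 15 12"
```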

This is still super tedious, so we build programming languages with really high-level functionality. Suddenly numbers can be anything: they become text, they become 3D objects, they become network packets being sent around the world. But this is all meaning added outside of what the computer “is”.

Even after years of computer science I find it fascinating that these four basic capabilities are all you need to build… all this! (Theoretically.)

========

*** Real computers and theory ***

What I described are the theoretical capabilities of Turing machines (actually, of WHILE programs, but they are equivalent in what they are capable of computing). Read up on this stuff, it’s brilliant.

Real computers can do less than these theoretical machines, because they have limited memory. They also work differently physically, partly because of this fact.

I wanted to add this to minimize yelling and shouting, because obviously a CPU does not add numbers in loops; it uses parallel addition circuits.

So, modern CPUs can physically add two numbers directly, without the need for a loop, similar to how we add numbers by hand, column by column:

435 + 123 = (4+1) (3+2) (5+3) = 558

One might say: hey, this is better than our three commands! It adds something! But this only works because the maximum value of each variable is fixed. If you add numbers bigger than what you planned for when you built your computer, it breaks apart; the size is limited, so you KNOW how many digits to add BEFOREHAND. For arbitrarily large numbers you would still need a loop to iterate over all the digits until you are done.
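
If you want to see “the size is limited” in action, here is a quick Python sketch of what a fixed-width adder effectively does (I’m assuming an 8-bit width just for illustration): it keeps only the digits it was built for, so a result that is too big simply wraps around.

```python
# Fixed-width addition: keep only the 8 binary digits we "built hardware" for.

BITS = 8
MASK = (1 << BITS) - 1        # 255, the biggest value an 8-bit variable can hold

def add_8bit(x, y):
    return (x + y) & MASK     # drop everything that doesn't fit

print(add_8bit(100, 100))     # 200 -- fits, so the answer is right
print(add_8bit(200, 100))     # 44  -- the true answer 300 "breaks apart"
```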

There are also specialized units built for special mathematical purposes (floating-point operations, matrix operations). They work a bit differently when it comes to interpreting variables, but the available computational power is, in the end, the same.

Furthermore, when I say “the CPU in the end only RUNS those three commands”, this is not meant literally. Obviously the CPU works differently. But from a computability standpoint, even while working differently, it cannot do more than what you CAN do by running those three commands. I simplified this for the sake of a potential 10-year-old.

I have the feeling we have now reached adulthood and should leave it at that. But I hope this little trip into theory was worthwhile, and I also hope I did not mess up too many theoretical terms, as English is not my main research language. Plus I’m on my cellphone.

Anonymous 0 Comments

Finally one I’m qualified to answer.

Computers are made from electronic components. What those components are has been changing as technology progresses, but one thing they all have in common is that they only work on electricity (duh). This means our numbers and languages mean nothing to a computer by themselves; they have to be represented as electricity somehow. And electricity can be modeled mathematically. This might sound complex, but it’s actually pretty intuitive. Take a switch: it is either on or off. We can easily represent this as 1 for “on” and 0 for “off”. Technically we call it ‘high’ for 1 and ‘low’ for 0, because we don’t completely turn off the so-called ‘switches’ of the electronic components inside the computer; we use two different voltage levels, a high level (1) and a low level (0).

What we want the computer to do is then just a matter of combining these ‘components’ so that when we turn on a certain set of switches (our inputs), a certain set of bulbs lights up on the other side (our outputs). For example, take two inputs A and B and an output C, where A and B are the ‘switches’ and C is the ‘bulb’. We can create a circuit where the bulb turns ON only when both A and B are ON. In other words, C is 1 only when A and B are both 1. We have successfully created what is called an **AND GATE.** There are many other such gates, and we can use them to build all kinds of circuits that give a desired output for a given set of inputs.

**What you read above is how MACHINE LANGUAGE works: the actual language that a computer understands.** In the early days of computers, this is exactly how we interacted with them, through 1s and 0s. But as we wanted to do more complex tasks, typing in 1s and 0s became tedious. So we came up with short codes for the long strings of 1s and 0s; for example, we decided that ‘1000001’ means the letter ‘A’, ‘1000010’ means ‘B’, and so on.
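
You can check those letter codes yourself; the numbers in the paragraph above are just the standard ASCII codes, which Python can show you directly:

```python
# The letter codes from the paragraph above, checked in Python:
# 'A' is stored as the number 65, which is 1000001 in binary.

for letter in "AB":
    code = ord(letter)              # the agreed-upon number for this letter
    print(letter, code, format(code, "b"))
# A 65 1000001
# B 66 1000010
```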

Now, think of the languages we use today as a pyramid. At the bottom is machine language, and at the top are “high-level languages” like Python and Java. The higher the language, the easier it is for us to understand; the lower the language, the easier it is for the computer to understand. Accordingly, we have different translators for different languages depending on where they lie on the pyramid. For the highest-level languages the translators are correspondingly complex, whereas for machine language we don’t need a translator at all.

The rest is just the job of programmers: creating this translator, called a ‘compiler’ or an ‘interpreter’, so that it translates the code we write into the corresponding machine-language instructions that mean what we want.

Anonymous 0 Comments

Electricity moves along a copper wire. When you connect a copper wire to the two opposite ends of a battery, electricity flows through the wire. When electricity is flowing through a wire, we call that state “1”; when it is not flowing, that is state “0”. A computer is really a circuit board. A circuit board is a permanent and solid thing: it is rectangular, and etched into its surface are copper wires. These wires transport electrical signals.

So a computer is just a circuit board. A circuit board just transports energy. There is no understanding. A compiler just transforms written words into electrical signals that are fed to a circuit board, like on a production line.

Anonymous 0 Comments

**A computer is basically a HUGE electric “marble race” game, with the track shaped by the program.**

You start with logic gates. LOTS of them. The power tumbles in lots of different ways according to the way the “track” is set up, and after a lot of applied ingenuity you get it to actually do something useful.

What do I mean by a logic gate? OK, let’s say you design a tiny electronic circuit that has two wires (A & B, say) going in and one wire (C) coming out. And that it’s designed so that there’s an output signal if (and only if) it’s getting at least one input signal. If both are missing, nothing comes out. That’s basically a logical OR – C gives a signal if and only if it gets one from A OR B.

Then you design another that gives an output when, and only when, it has no input on a specific wire. That’s a NOT operation.

You can go on to work out how to build basic circuits that do other logical operations, but with OR and NOT you’ve now already got enough circuit designs to combine to give all the possible logical operations – AND, XOR and so on. [You can make an AND with three NOTs and an OR, for example: feed each of A and B into their own NOT gates; feed the two outputs into an OR gate; feed the output from that into the third NOT gate; what comes out is basically A AND B. Proof left, as they say, as an exercise.]
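
If you don’t feel like doing the exercise on paper, here is a quick brute-force check in Python that the three-NOTs-and-an-OR construction really behaves like AND for every possible input:

```python
# Brute-force check: NOT(NOT A OR NOT B) matches A AND B for all four inputs.

def OR(a, b):  return a or b
def NOT(a):    return not a

def AND_from_parts(a, b):
    return NOT(OR(NOT(a), NOT(b)))   # three NOTs feeding one OR

for a in (False, True):
    for b in (False, True):
        assert AND_from_parts(a, b) == (a and b)
print("All four input combinations match -- it really is an AND gate.")
```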

So now you start working out how to combine all the gate types into bigger combinations that do something more interesting. You can combine them to tweak a set of electrical values that represent a number so that you effectively add 1, for example (there’s a sketch of that just below). So you keep going, building the complexity. What you’re aiming for is a heap of circuits that will copy the values from one place to another, then do interesting things based on what those values are, then move on to the values in another place, and so on. And those values that make it do the different interesting things – are the program.
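
Here is a rough Python sketch of that “add 1” idea: a toy 4-bit incrementer built only from gate-like functions, not any real circuit. Each output bit is the XOR of that bit with the incoming carry, and an AND decides whether the carry ripples on to the next bit.

```python
# Toy 4-bit "add 1" circuit built from gates.
# Bits are listed least-significant first.

def xor(a, b):  return (a or b) and not (a and b)
def and_(a, b): return a and b

def increment(bits):
    carry = True                     # "add 1" means the first carry-in is 1
    out = []
    for bit in bits:
        out.append(xor(bit, carry))  # this bit flips if a carry arrives
        carry = and_(bit, carry)     # the carry moves on only if the bit was 1
    return out

# 0011 (three), least-significant bit first -> 0100 (four)
print(increment([True, True, False, False]))   # [False, False, True, False]
```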

Add power. Electricity tumbles in all sorts of different ways, and the track ticks into a different shape. Add a bit more power, and it ticks into a new shape again. And the shapes are controlled by whatever values we put in some special places – those values are the program. Keep adding power and with enough ticks, suddenly you’re browsing Reddit. You’ve got a computer.

Big marble race, like I said.

You could, quite seriously, design a “computer” to run a modern operating system using marbles. It would be enormously big and INCREDIBLY slow – and the engineering challenges to make it work would likely be insuperable – but in principle you could do it.