Programming code tells the computer to do something, but what makes the code actually mean anything to the computer?

OK let’s try philosophy for a 5yo. 😀

I think to really understand what things “mean” you need to know what “meaning” (in your question) means for your computer.

Of course there are bits and bytes and CPUs and machine code and programming languages and Fortnite and so on, but that still does not really tell you what things “mean”.

*** NOTE: There are some differences between how a “real” computer works and what I describe here. I will lay these out at the end. I have good reasons! 🙂 ***

I think the missing piece in what others pointed out (correctly) is: what can computers actually DO? Because this is what the “code” is; it’s instructions to DO something. And this is the only “meaning” things have for a computer: the code tells it to execute operations. What those operations are, what they lead to – well, the computer does not know, of course.

But what are those commands? What does a computer do? Even the most advanced hardcore gaming PC is, in the end, a very simple machine when it comes to doing things. Let’s make a list of what it can do:

1. It can store natural numbers. So you can, simply put, have a place in the computer that has a name and that you can put numbers in. Let’s call those places variables and give them names: a, b, c, d, …

2. You can set a variable to a number you like: a = 1337

3. You can add a constant value to a variable or subtract one from it: a = a - 10

4. A loop: you can repeat a set of commands. The loop stops when a variable you specify reaches 0.

This. is. it. Not too bad, is it?

And your code tells the computer to do one of those things, one after another. That is the meaning of code.
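If you want to see this in action, here is a tiny sketch in Python (a luxury compared to real machine code, but it uses nothing beyond those commands): it adds the variable b into the variable a, one step at a time.

```python
# Add b into a using only the allowed commands:
# set a variable, add/subtract a constant, loop until a variable is 0.
a = 5            # command 2: set a variable
b = 3            # command 2: set a variable

while b != 0:    # command 4: loop until b is 0
    a = a + 1    # command 3: add a constant
    b = b - 1    # command 3: subtract a constant

print(a)         # 8; the computer never knew it was "adding", it just looped
```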

Now there are two more immediate follow-up questions, and I’m afraid we need to grow up a bit now (I’m not clever enough to describe this more simply).

1. I don’t believe you. If this is all, why the heck can we play Fortnite in HDR?

The pixels you see on your screen are represented by variables. So basically, if your screen is black to start with, to set one pixel to white, you need to figure out which variable controls it and set it to the number that represents white. The computer has no idea what it’s doing and has no concept of white, so this is the only meaning it has to understand. The same is true for every feature of your computer that does not seem to have anything to do with numbers: for the computer, everything is just a number. We provide the meaning, with output devices and by experiencing the output.
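To make that concrete, here is a toy sketch, assuming a made-up 4x4 monochrome screen where 0 means black and 255 means white (real graphics hardware is far more involved):

```python
# A pretend 4x4 screen: just a row of numbered variables in memory.
WIDTH, HEIGHT = 4, 4
framebuffer = [0] * (WIDTH * HEIGHT)    # all pixels start black (0)

def set_pixel(x, y, value):
    # "Which variable controls it": pixel (x, y) lives at index y * WIDTH + x.
    framebuffer[y * WIDTH + x] = value

set_pixel(2, 1, 255)   # we call it "a white pixel"; the computer stores a 255
```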

2. This still sounds wrong. All computer languages I know have waaaaaaay more commands than three.

True. This is where what others here said comes in. Building Fortnite with only those three commands would be tedious, maybe even impossible, since we would have a very hard time reading and understanding code that is so easy for the computer to run.

So we build layers of complexity on top of this:

The CPU (the thing that runs your computer) itself already has a ton of helper functions. For example, the three commands do not allow you to add one variable to another; you need a small program with multiple commands! How tedious. So the microcode of the CPU includes this program, and you can simply say c = a + b. (Not completely true, see far below, but it illustrates microcode.) There are a lot of these helpers. Some are built physically with wires and transistors, some are actual little programs within the CPU; the difference does not really matter, because they still do not create more “meaning” for the computer, and all it does in the end is execute commands 2-4.
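Just as a sketch (this is not what real microcode looks like): once the “add two variables” loop is packaged as a helper, you can build the next layer, say multiplication, on top of it, and underneath it is still nothing but commands 2-4.

```python
def add(a, b):
    # the "small program with multiple commands" from above
    while b != 0:
        a = a + 1
        b = b - 1
    return a

def multiply(a, b):
    # one layer up: repeated addition, still only loops and +/- 1 underneath
    result = 0
    while b != 0:
        result = add(result, a)
        b = b - 1
    return result

print(multiply(6, 7))   # 42
```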

The CPU commands have numbers as names (as they address the variables that contain the little programs!), which is super tedious. So we gave them names and wrote a program (an Assembler) that converts names humans can remember into those numbers. So instead of writing 16 15 12 (add (command 16) the number in variable 12 to the number in variable 15), you write add b,a and the Assembler will translate.
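Here is a toy sketch of what such a translator does; the opcode and variable numbers just mirror the made-up example above and are not any real instruction set:

```python
# Toy assembler: turn a mnemonic line into the numbers the machine stores.
OPCODES = {"add": 16}            # made up: 16 means "add"
VARIABLES = {"a": 15, "b": 12}   # made up: a lives at 15, b at 12

def assemble(line):
    op, operands = line.split(maxsplit=1)
    src, dest = [VARIABLES[name.strip()] for name in operands.split(",")]
    return [OPCODES[op], dest, src]   # encoding: opcode, destination, source

print(assemble("add b,a"))   # -> [16, 15, 12]
```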

This is still super tedious, so we built programming languages with really high-level functionality. Suddenly, numbers can be anything: they become text, they become 3D objects, they become network packets being sent around the world. But this is all meaning added outside of what the computer “is”.
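Text is maybe the easiest example of this added meaning. To the machine a string is just a list of numbers; an encoding like ASCII is simply the agreement we humans made about what those numbers “mean”. A quick Python illustration:

```python
message = "Hi!"
print([ord(c) for c in message])             # [72, 105, 33]: what is stored
print(bytes([72, 105, 33]).decode("ascii"))  # "Hi!": the meaning we layer on top
```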

Even after years of computer science I find it fascinating that these three commands (plus a place to store numbers) are all you need to build… all this! (theoretically)

========

*** Real computers and theory ***

What I described are the theoretical capabilities of Turing machines (actually, of WHILE programs, but the two are equivalent in what they can compute). Read up on this stuff, it’s brilliant.

Real computers can do less than these machines because they have limited memory. Partly due to this fact, they also work differently physically.

I wanted to add this to minimize yelling and shouting, because obviously a CPU does not add numbers in loops; it uses parallel addition units.

So, modern CPUs can physically add two numbers directly without the need for a loop, similar to how we add numbers manually:

435 + 123 = (4+1) (3+2) (5+3) = 558

One might say: hey, but this is better than our three commands! It adds something! But this only works because the maximum value of each variable is fixed; if you add numbers bigger than what you planned for when you built your computer, it breaks apart. The size is limited, so you KNOW how many digits to add BEFOREHAND. For arbitrarily large numbers you would still need a loop to iterate over all the digits until you are done.
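You can mimic that fixed-size behavior in a few lines of Python (an 8-bit adder, purely as an illustration):

```python
def add_8bit(a, b):
    # a hardware adder with 8 digit positions planned in: keep only 8 bits
    return (a + b) & 0xFF

print(add_8bit(100, 100))  # 200: fits in 8 bits, fine
print(add_8bit(200, 100))  # 44, not 300: the number "broke apart" (wrapped)
```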

Also, there are specialized units built for special mathematical purposes (floating-point operations, matrix operations). They interpret variables a bit differently, but the available complexity is, in the end, the same.

Furthermore, when I say “the CPU in the end only RUNS those three commands”, this is not meant literally. Obviously the CPU works differently. But from an information-theory standpoint, even while working differently, it cannot do more than what you CAN do by running those three commands. I abbreviated this for the sake of a hypothetical 10-year-old.

I have the feeling that we have now reached adulthood and should leave it at that. But I hope this little trip into theory was worthwhile, and I also hope I did not mess up too many theoretical terms, as English is not my main research language. Plus, I’m typing this on a cellphone.
