Programming code tells the computer to do something, but what makes the code actually mean anything to the computer?

In: Engineering

36 Answers

Anonymous 0 Comments

The compiler translates code from the words we write into numbers the computer understands.

Anonymous 0 Comments

Programs are usually written in a “programming language” that is easy for people to learn and use. Then there is “machine language”, basically a string of 1s and 0s arranged in complex patterns that a computer can understand.

In between the person and the computer is a special program called the “compiler”. It takes your programming language and turns it into machine language. It’s like an interpreter: if I need to talk to somebody who speaks Russian, but I only speak English, I have to find someone who speaks both to translate what I’m saying.

Why don’t programmers all just learn machine language? Well, it’s really, really hard. And it takes a long time to say anything. So the smartest ones who can speak it come up with a programming language that the rest of us can understand. Then the first things they do are write a compiler and publish a dictionary and rules of their language.
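
To make that concrete, here’s a minimal sketch in C (my own example, not tied to any particular compiler). The assembly in the comment is only illustrative; the real instructions depend on the CPU and the compiler, and you can look at the actual output with a command like gcc -S.

```c
#include <stdio.h>

int main(void) {
    int a = 2, b = 3;
    int c = a + b;      /* one line in the programming language */

    /* A compiler translates that line into a few machine instructions.
       On an x86-64 machine it might come out roughly like:

           mov  eax, a     ; load a into a register
           add  eax, b     ; add b to it
           mov  c, eax     ; store the result in c

       and each of those instructions is ultimately just a pattern of
       1s and 0s that the CPU is wired to recognize. */

    printf("%d\n", c);
    return 0;
}
```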

Source: am programmer.

EDIT: If you’re also curious about what the 1s and 0s mean to the computer, check out some of the excellent engineering comments.

Anonymous 0 Comments

I think what OP is asking is how a computer takes binary machine code and knows to run it. Is there code built into the processor that accepts the machine code as a “fuel”? What decides which code to run at what times (queue, priority, etc.)?

Anonymous 0 Comments

If you want to dive deep into this topic, pick up Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It starts with basically wires and electricity and explains how to build a computer from scratch.

Anonymous 0 Comments

I’m sure someone can explain it better but:

At the most basic level, a computer is just a set of transistors. They are switches that can be “on” and “off”, which are translated to 1 and 0 respectively.

What programs do is define the context for what patterns of 1s and 0s mean in different scenarios. For example: 01000001 in binary is the decimal number 65. It can also be the uppercase letter “A”. It all depends on the context the program provides, which tells the computer which way to interpret that specific pattern.
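
For instance, here’s a tiny C sketch (just an illustration) that prints the very same bit pattern two different ways, depending on how the program asks for it:

```c
#include <stdio.h>

int main(void) {
    unsigned char pattern = 0x41;   /* the bit pattern 01000001 */

    printf("as a number:    %d\n", pattern);   /* prints 65 */
    printf("as a character: %c\n", pattern);   /* prints A  */
    return 0;
}
```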

Programming languages offer a necessary shortcut that allows programmers to write many instructions with minimal effort. If all programmers had to write binary, there wouldn’t be many of us around.

Anonymous 0 Comments

The computer’s built so that coded patterns of 1’s and 0’s physically open / close different paths for electricity, to make it do different things. These coded patterns are called “machine code.”

The details of exactly what patterns are available, and what they mean, can be different for different models, brands, or kinds of computers. A CPU manufacturer typically publishes a manual with a complete specification of the patterns.

Working directly with the coded patterns the computer actually uses is inconvenient for human programmers. It would also be useful if the same program could run on multiple models / brands / kinds of computers.

So people created programs (compilers, interpreters, shells, JITs) that allow the computer to “understand” English-like commands. This involves a “translation” process, sort of like translating from German to Italian. (Except the computer is, well, a computer, so it expects programmers to use perfect spelling and grammar, but will happily translate a buggy or completely nonsensical program as long as it’s grammatically correct.) It can be done in a few ways:

– A compiler works like translating a novel. A long program’s translated all at once, then the result’s saved in a file that the computer can run.
– An interpreter analyzes one “sentence” (line of code) at a time, runs that one, then moves on to the next. Sort of like translating a novel out loud as you read it.
– A shell lets you type a line of code, immediately runs it, then shows you the result. Sort of like when a businessman or government official goes to a foreign country, they might bring a human translator who translates each sentence immediately when they say it.
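
As a rough illustration of the interpreter / shell idea (a toy sketch, nothing like a real language implementation), here is a small C program that reads one “sentence” at a time, such as 2 + 3, runs it immediately, and shows the result:

```c
#include <stdio.h>

/* A toy "shell": read one line at a time, translate it (here just
   "number, + or -, number"), run it, and show the result, instead of
   translating a whole program up front the way a compiler does. */
int main(void) {
    char line[100];
    double a, b;
    char op;

    printf("> ");
    while (fgets(line, sizeof line, stdin)) {
        if (sscanf(line, "%lf %c %lf", &a, &op, &b) == 3) {
            if (op == '+')      printf("%g\n", a + b);
            else if (op == '-') printf("%g\n", a - b);
            else                printf("unknown operator: %c\n", op);
        } else {
            printf("could not understand that line\n");
        }
        printf("> ");
    }
    return 0;
}
```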

Anonymous 0 Comments

At the human end we have high-level languages, which are easy for us to understand and write but which need to be interpreted or compiled to machine code before the computer can run them. Interpreted languages are slower but more immediate, meaning you can run them straight from the source code (classically BASIC, more recently Python), whereas compiled languages are faster but need to go through a compilation step to be turned into executable code.

The machine code is a sequence of instructions that the CPU understands, and it is typically different for each type of CPU. For example, an Intel processor can’t run the same machine code as an ARM processor, even though the source code can be the same; the compiler turns that source into binary machine code for the specific type of CPU. At the CPU level itself there are small pieces of memory (registers) and instruction decoders, which then execute the binary code.

A more recent wrinkle is the Java Virtual Machine (JVM). Java source code is compiled into a special byte code that the JVM understands, and the JVM then issues CPU-specific instructions. So while Java source code is compiled, it is compiled to an intermediate form designed to run on the JVM, and you just need a JVM built for your specific CPU. That means languages like Java and their ilk can produce code that runs almost as fast as natively compiled code but will run on any CPU that has their virtual machine.
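
The virtual machine idea in miniature (purely a toy of my own; the “opcodes” below are made up, and the real JVM is far more sophisticated): a small C program that reads CPU-independent byte code and carries it out. The same byte sequence would run anywhere this little VM has been compiled.

```c
#include <stdio.h>

/* Made-up opcodes for a toy stack-based virtual machine,
   loosely in the spirit of JVM byte code (not the real thing). */
enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

int main(void) {
    /* A tiny "compiled" program: push 2, push 3, add them, print, stop. */
    int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };

    int stack[16];
    int sp = 0;      /* stack pointer */
    int pc = 0;      /* program counter: which instruction comes next */

    for (;;) {
        switch (program[pc++]) {
        case OP_PUSH:  stack[sp++] = program[pc++];        break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp];   break;
        case OP_PRINT: printf("%d\n", stack[sp - 1]);      break;
        case OP_HALT:  return 0;
        }
    }
}
```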

Anonymous 0 Comments

I used to ask this question all the time, and the answer was just “the compiler”. How does the compiler work? No one could answer that one.

Anonymous 0 Comments

At its heart you can think of a computer like a programmable calculator. It can add, subtract, tell if two numbers are the same, bigger, or smaller, and do different calculations depending on the result (if bigger, do A; if smaller, do B; and the calculations continue). More complicated operations can be built from these simple ones. We humans have devised ways of describing behavior and translating it into those calculations. That is basically a programming language.

It would be tough for a five-year-old, but not impossible, to read one of these instructions for the computer and figure out what it does. Some of them are pretty much the same as you’d see in code. “A = B + C” could be written “add A, B, C”, where A is the result and B and C are the values to add. This is then turned into a number that the computer understands.
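
Here’s a sketch of that last step, using a completely made-up instruction format (real CPUs each define their own layout in the manufacturer’s manual): the pieces of “add A, B, C” get packed into one number.

```c
#include <stdio.h>

int main(void) {
    /* A made-up instruction layout, just to show the idea:
       bits 24-31 hold the operation code,
       bits 16-23 / 8-15 / 0-7 hold three register numbers. */
    unsigned int ADD = 0x01;           /* pretend "add" is operation 1 */
    unsigned int A = 3, B = 1, C = 2;  /* pretend register numbers     */

    /* "add A, B, C" (A gets B + C) packed into one 32-bit number: */
    unsigned int instruction = (ADD << 24) | (A << 16) | (B << 8) | C;

    printf("the instruction as a number: 0x%08X\n", instruction);
    /* prints 0x01030102; the CPU's circuitry is built to pull these
       bit fields back apart and do the addition they describe. */
    return 0;
}
```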

How the computer understands this number isn’t exactly easy for me to explain. It’s quite complicated, even though the math is often very easy. The principle isn’t that different from what’s described above, though. We’ve figured out how to use electricity to do basic math, and that number I mentioned is turned into the math. I’m too tired to explain from there, but that’s essentially how it works.