Programming code tells the computer to do something, but what makes the code actually mean anything to the computer?

A computer is just a machine. It really doesn’t understand what it has been instructed to do. It just follows a very long and complicated list of primitive commands, expressed as software.

Deep down, a computer can only do a very few things. Set a bit to 0. Set a bit to 1. Check the current value of a bit: 0 or 1.

The most primitive computer commands are IF, AND, OR, and XOR.

A programmer can instruct the computer to test the value of a bit using IF. IF the bit is a 1 (which we generally call TRUE or ON), the computer will execute the next block of commands. IF the bit is 0, it will skip ahead to the next command block.
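As a rough sketch (Python here is just a stand-in, since the answer isn’t tied to any particular language), that IF behaviour looks like this:

```python
bit = 1            # the bit being tested

if bit:            # a 1 is treated as TRUE / ON
    print("execute this block of commands")
else:              # a 0 is treated as FALSE / OFF
    print("skip ahead to the next command block")
```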

The AND command compares two bits, and depending on their values will set a third bit to either 0 or 1. Binary computers are built so that

1 AND 1 = 1,
0 AND 1 = 0,
1 AND 0 = 0,
0 AND 0 = 0

Similarly, the OR and XOR commands compare two bits and set the value of a third, but with a different mapping: OR sets the result to 1 if either input bit is 1, while XOR sets it to 1 only if exactly one of the two bits is 1.
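A quick way to see all three mappings at once, again sketched in Python, where the &, | and ^ operators stand in for the AND, OR and XOR circuitry:

```python
# Print the AND, OR and XOR result for every combination of two bits.
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} AND {b} = {a & b}    "
              f"{a} OR {b} = {a | b}    "
              f"{a} XOR {b} = {a ^ b}")
```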

Engineers have decided how they want to encode numbers and letters into long sequences of 0s and 1s. By convention (and there are different encoding schemes), we generally write the number 0 as 0000 0000, the number 1 as 0000 0001, the number 2 as 0000 0010, and so on… this is binary encoding.
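For example (the grouping into blocks of four digits is only for readability):

```python
# Show the conventional eight-bit binary encoding of a few small numbers.
for n in (0, 1, 2, 5, 200):
    bits = format(n, "08b")                   # e.g. 2 -> "00000010"
    print(f"{n:>3} -> {bits[:4]} {bits[4:]}")
```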

As it turns out, once a computer knows how to execute IF, AND, OR, and XOR, it can be taught to perform Addition by combining the bits of two encoded numbers, one position at a time, following a sequence of primitive commands.

For Addition, this series of primitive commands is not so different from “1 plus 1 equals 0, carry the 1…”. Adding two eight-bit numbers can be expressed in a few dozen primitive commands, maybe fewer if the programmer is clever about writing an efficient algorithm.
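Here is one way that “carry the 1” procedure could be sketched, using nothing but the AND, OR and XOR primitives described above (the function names are made up for illustration):

```python
def add_bit(a, b, carry_in):
    """Add two single bits plus a carry, using only AND, OR and XOR."""
    total = a ^ b ^ carry_in                       # the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))     # the "carry the 1" bit
    return total, carry_out

def add_8bit(x, y):
    """Add two eight-bit numbers one bit position at a time, right to left."""
    result, carry = 0, 0
    for i in range(8):
        a = (x >> i) & 1          # pick out bit i of x
        b = (y >> i) & 1          # pick out bit i of y
        s, carry = add_bit(a, b, carry)
        result |= s << i          # set bit i of the answer
    return result

print(add_8bit(3, 5))   # prints 8
```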

The computer doesn’t really know that the bits it is setting have any special meaning to us. It is just flipping switches on and off according to the commands it has received.

Of course, once it has been taught to perform Addition, it can be taught how to perform Multiplication by doing Addition in a loop, through computational brute force. Same then with Exponentiation.
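As a simple sketch of that brute-force idea (again just illustrative Python, not how a real CPU is laid out):

```python
def multiply(a, b):
    """Multiplication as Addition repeated in a loop."""
    total = 0
    for _ in range(b):
        total = total + a          # each '+' is itself built from bit primitives
    return total

def power(a, n):
    """Exponentiation as Multiplication repeated in a loop."""
    result = 1
    for _ in range(n):
        result = multiply(result, a)
    return result

print(multiply(6, 7))   # 42
print(power(2, 10))     # 1024
```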

Modern programming languages do not require a programmer to deal with these bitwise commands anymore. All the primitive operations have been built into the hardware, the operating system, and the compilers.

Programmers these days just say “Multiply A times B”, and the compilers convert that abstract intention into the very long series of machine instructions that deal with flipping the right switches to 0 and 1, in the right manner, and then sending the resulting value to output.
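You can get a rough feel for this layering from Python’s built-in dis module, which shows the lower-level instructions that a simple multiplication turns into (bytecode for the interpreter rather than true machine instructions, but the idea is the same):

```python
import dis

def product(a, b):
    return a * b        # the programmer just says "multiply A times B"

# Show the lower-level instructions hiding behind that one line;
# on real hardware these are translated further into machine code.
dis.dis(product)
```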
