A computer chip works by being sent commands (instructions), encoded as ones and zeros. Every chip understands a certain set of commands (its instruction set).
Before code written in a programming language can be understood by a chip, it has to be converted to ones and zeros (binary code).
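A toy model of this idea, written in Python with a made-up three-instruction set (the opcodes and their numeric values here are illustrative assumptions, not a real chip's instruction set):

```python
# A toy "chip": a loop that fetches numeric commands and executes them.
# These opcode numbers are invented for illustration only.
LOAD, ADD, HALT = 1, 2, 3  # each instruction is just a number

def run(program):
    stack = []
    pc = 0  # program counter: which position in the program we're at
    while True:
        op = program[pc]
        if op == LOAD:       # push the next number onto the stack
            stack.append(program[pc + 1])
            pc += 2
        elif op == ADD:      # pop two numbers, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
            pc += 1
        elif op == HALT:     # stop and return the top of the stack
            return stack.pop()

# "2 + 3" encoded purely as numbers, the only form this toy chip understands:
print(run([LOAD, 2, LOAD, 3, ADD, HALT]))  # prints 5
```

A real chip does the same kind of fetch-and-execute loop, just in hardware and with far more instructions.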
The lowest-level type of programming code is called “assembly code”. It is the code that is most easily translated to binary (strictly speaking, this translation is done by a program called an assembler, though the whole process is often lumped under “compiling”). But assembly is also difficult for humans to write and maintain.
This is why so-called “higher-level programming languages” were invented. These are much easier for humans. They often don’t compile directly to binary, but first to a form of assembly suitable for the target instruction set, and then to binary.
Finally, there’s a class of programming languages that doesn’t compile to binary at all. These languages require an already-compiled program, an interpreter, to execute the code: the interpreter reads the code of the programming language and sends the corresponding instructions to the chip.
In some languages this is not done directly: the source code is first converted to an intermediate binary format (“bytecode”), similar to what a chip understands, but instead understood by the executing program, which translates it into binary the chip understands. Such a program is also referred to as a Virtual Machine (not to be confused with full computer emulation).
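Python itself works this way: its standard `dis` module lets you peek at the intermediate bytecode that the Python Virtual Machine executes (the exact instruction names vary between Python versions):

```python
import dis

def add(a, b):
    return a + b

# List the intermediate VM instructions for add(); names differ
# slightly across Python versions, but a return instruction is
# always among them.
opnames = [ins.opname for ins in dis.get_instructions(add)]
print(opnames)
```

Each of these named instructions is a small numeric code inside the `.pyc` file, which the Virtual Machine then carries out using the chip's real binary instructions.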