This is a pretty advanced question that I’m not sure I’ll be able to ELI5, but I’ll try my best.
First, people *don’t* write code in binary, at least not directly. I really don’t like the colloquialism that computers only read 0s and 1s. While that’s technically true, it misses a lot of what’s actually going on. For one, it leads people to think programmers write code in 0s and 1s. They don’t, and if they did, the tech industry would be decades behind.
What they actually do is write code in high-level languages that try to be as human-readable as possible. You can create a Person object that has an age and a name attribute. You can compare Person objects by their age, or name, or both, or anything you want, really.
This way, if you had a bunch of information about people, you could just create a Person object for each data entry and attach whatever data you need to it. It makes it really easy to understand what’s going on.
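For example, here’s a minimal sketch of that idea. I’ll use C, since that’s the language I’ll walk through below; C uses structs rather than full objects, but the concept of bundling related data together is the same. All the names here (Person, compare_by_age, and so on) are just made up for illustration:

```c
#include <stdio.h>
#include <string.h>

/* A sketch of the "Person" idea: bundle related data together. */
typedef struct {
    char name[32];
    int  age;
} Person;

/* Compare two Persons by age: negative if a is younger, positive if older. */
int compare_by_age(const Person *a, const Person *b) {
    return a->age - b->age;
}

/* Compare by name, falling back to age when the names match. */
int compare_by_name_then_age(const Person *a, const Person *b) {
    int c = strcmp(a->name, b->name);
    return c != 0 ? c : compare_by_age(a, b);
}

int main(void) {
    Person alice = { "Alice", 34 };
    Person bob   = { "Bob",   29 };

    if (compare_by_age(&alice, &bob) > 0)
        printf("%s is older than %s\n", alice.name, bob.name);
    return 0;
}
```

The nice part is that once you’ve written a comparison function like that, you can reuse it anywhere, like sorting a whole array of Persons, without thinking about the raw bytes underneath.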
Even though high-level code is supposed to be the easy, readable part, if you don’t have experience with coding I’d assume a lot of what I just said sounds like nonsense. That’s because even though a high-level language tries to be as readable as possible, it’s still a programming language, and you can’t get away from programming concepts and paradigms.
Abstracting any further, all the way to literal human language, would lose the precision that programming requires. At the heart of it, you have to tell the computer *exactly* what to do, and that just isn’t possible with human language. A programming language has to be built from computing concepts and then molded to be as human-friendly as possible.
Anyway, these high-level languages get “translated” into lower-level languages. There’s a program called the *compiler* that takes code in language A and turns it into code in language B. These are very complicated programs that people spend their lives researching and working on. Compilers, along with operating systems, are among the most complex and rigorously engineered programs humans have created.
Again, the compiler is the program that turns your Python/Java code into code the computer understands, i.e., the 0s and 1s. However, there are a couple of steps in this process; we don’t go from Java directly to 0s and 1s.
Java is special because it has an extra step most languages don’t (it compiles to an intermediate bytecode that runs on a virtual machine). I won’t get into it further because it would just needlessly complicate things.
Let’s say we’re just compiling some C code. The C compiler takes your C code and turns it into assembly language. Then the assembly gets translated (by a different program, called an assembler) into machine code, which the computer can then execute.
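To make that concrete, here’s roughly what happens to a tiny C function along the way. This is just a sketch: the exact assembly and byte values depend on your compiler, CPU, and optimization settings.

```c
/* add.c -- a tiny function to feed through the pipeline. */
int add(int a, int b) {
    return a + b;
}

/* Step 1: the compiler turns the C into assembly, e.g. "cc -S add.c".
   On x86-64 with optimizations, the output looks roughly like:

       add:
           lea  eax, [rdi + rsi]   ; eax = a + b
           ret

   Step 2: the assembler turns that assembly into machine code bytes,
   roughly:

       8d 04 37    (the lea)
       c3          (the ret)

   Those bytes are the "0s and 1s" the CPU actually executes. */
```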
There are a few other steps in between (preprocessing, linking), but they needlessly complicate things and aren’t necessary for explaining how this process works.
It’s basically just a series of steps turning code in language A into code in language B. The exact process is very formal and complex, and it has been researched extensively. The compiler is one of the most important programs on a system, because every program written in that language has to pass through it.
Not just that, but a C compiler has to be backward compatible. There are a *lot* of standards a C compiler has to conform to, and if it doesn’t, it breaks a lot of things. The compiler is one of the most delicate systems humans have made, and most people have no idea what it is or how important it is. It directly affects every program written in the language, so it needs to be consistent: you can’t release a new compiler that breaks previously working code (at least, preferably not).
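As a small illustration of how far that backward compatibility goes: this 1970s-era “K&R style” function definition still compiles with mainstream compilers like GCC and Clang (in pre-C23 standard modes), decades after the style went out of fashion:

```c
/* Old-style (K&R) C function definition: the parameter types are
   declared separately, below the parameter list. Modern compilers
   still accept this in older standard modes so that decades-old
   code keeps building. */
int add(a, b)
int a, b;
{
    return a + b;
}
```

That definition style was only removed in the C23 standard, roughly half a century after it was introduced, which gives you a sense of how slowly and carefully these things are allowed to change.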