So, you may have heard that everything computers do is just operations on 1’s and 0’s. Basically, a computer is full of tiny light switches that can be turned on (1) or off (0). Certain combinations of off and on switches mean certain things to the computer, like “go look in this spot for what you’re looking for”, or they can represent actual data (for example, “01101000 01100101 01101100 01101100 01101111” spells out “hello” in binary). Computers only understand binary (1’s and 0’s); that’s all they work with. Humans, however, have a pretty hard time reading and writing in binary. Look at how hard it was to read “hello”, and imagine how difficult it would be to read and write complicated instructions, or whole paragraphs of text you wanted to store, if you had to write it all out in 1’s and 0’s.
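If you’re curious, here’s a tiny Python sketch (the variable names are just made up for illustration) that turns each letter of “hello” into its 8-digit binary code:

    # Turn each letter of "hello" into its 8-bit binary code
    word = "hello"
    binary = " ".join(format(ord(letter), "08b") for letter in word)
    print(binary)  # 01101000 01100101 01101100 01101100 01101111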
That’s where programming languages come in. You want to write instructions, but you want to write them closer to English than to binary. For example, you may want to tell the computer to display the word “hello” on the screen. There’s a big spectrum of languages: ones that are closer to plain English are called “high-level”, and ones that are closer to binary are called “low-level” (assembly is the classic example). I’m going to focus on high-level languages, since that’s probably what you see people use most often. There are lots of different ones, like Python, Java, etc. They all (for the purposes of this ELI5) work pretty much the same way, but they have different rules for how you write things. For example, in Python you would write ‘print(“hello”)’ and in Java you would write ‘System.out.println(“hello”);’. The difference is really just convention: within each language, everyone agrees to write things in a certain way and to use certain keywords to mean certain things.
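To give you a feel for how close to English this gets, here’s a small Python sketch (the names are just made up for the example):

    # A short Python program; the words mostly read like English
    name = "world"
    greeting = "hello, " + name
    print(greeting)  # shows: hello, world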
To write in a high-level language, you wouldn’t write in a Word document, because that file carries a bunch of extra information (like what font you used and what size your text is) that the computer wouldn’t need or understand. Instead, you can just open a plain text editor and type code into an empty .txt file. The trick is, you have to stick to one language and follow all of its syntax and conventions, like putting a semicolon at the end of each line or using indentation to line things up. This will be important for the next step. You can also use a code editor to do this, but code editors are really just software that gives you hints, like reminding you that you forgot a semicolon. You can copy and paste the text right out of a fancy editor into a plain .txt file and it will run exactly the same, so the important thing is just the words you wrote.
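For example, here’s a sketch of what you might type into an empty file in a plain text editor; the saved file contains nothing but these exact characters:

    message = "hello"
    print(message)

(This one happens to be Python; once it’s saved under the right name, the next step can run it.)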
Now, you take the .txt file and rename it so it ends in something like .c or .py, so the computer knows what language you wrote it in. Then, when you try to run your program, a chain of events translates your high-level code, which follows a certain language’s rules and looks kind of like English, into straight-up 1’s and 0’s that turn the computer’s light switches on and off. Something called a compiler takes the high-level code and replaces each instruction with more basic instructions, called low-level code. So, if you write “y = 2 * 5 + 4”, it might turn that into “x = 2 * 5”, “z = x + 4”, “y = z”. Then that gets replaced further by instructions that say something like “earlier, we stored the value of y at light switch spot 1234”, so now instead of “y” the instruction reads “1234”. This is assembly language. Everything gets turned into a light switch spot and whether the light switch at that spot should be turned on or off. The compiler that does all this is just another program, but it’s not one you have to write yourself; it was already written by other programmers and we generally trust it to work properly. Now you have everything in binary, just a bunch of 1’s and 0’s, and the computer turns its light switches on and off according to those instructions. Those light switches turning on and off are what make everything on your computer happen, and if you’ve written everything right, all the correct light switches will flip and your computer will do whatever your program told it to do.
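If you want to actually watch part of this translation happen, Python lets you peek at it. Python doesn’t go all the way down to machine code in one step, but it does break your code into simpler “bytecode” instructions, and the built-in dis module will print them out. This is just a sketch of the idea, not the whole compiler chain:

    import dis

    def calc(a):
        # One "big" line of high-level code...
        y = a * 5 + 4
        return y

    # ...gets broken into simpler steps: load a, load 5, multiply,
    # load 4, add, then store the result in y. The exact instruction
    # names vary between Python versions.
    dis.dis(calc)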
This is a simplified view, but that’s really what’s happening. John and Hank Green also have a good video series explaining the levels of abstraction, which I think is what you’re trying to get at. It’s a really complicated concept and hard to wrap your brain around at first, so don’t feel bad if it seems confusing.