When we think of modern coding, we think of Python and Rust and Swift and Ruby and so on.
My question is more abstract. How exactly did computer scientists develop the original language used to tell a computer what to do and when to do it? How did they teach the computer to recognize that language?
Going even further than that, how did current languages get developed? Did they just rewrite the original computer code from scratch or are all modern computer languages designed to communicate with this baseline original code which all computers were programmed with?
The processor itself is just an electric circuit.
The designers of a processor build it with pins for input and output signals, and then document which signals make the chip do what. So it will be something like
(numbers are made up):
When the signal on the pins is 01001011 00000001 00000010, the first 8 bits are the code of the command, and the next two blocks are the numbers to add.
The commands and their logic are defined by the CPU designers.
So this is machine code – the way the CPU actually works.
Then you can literally write those bytes and have a first program.
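To make that concrete, here's a tiny Python sketch that treats the made-up encoding above as a machine and "runs" those raw bytes. The opcode value and the layout are invented for illustration, just like the numbers above.

```python
# Pretend machine for the made-up encoding above:
# byte 0 = command code, bytes 1-2 = the two numbers to add.
OP_ADD = 0b01001011  # invented opcode, not from any real CPU

def run(instruction: bytes) -> int:
    opcode, a, b = instruction[0], instruction[1], instruction[2]
    if opcode == OP_ADD:
        return a + b
    raise ValueError(f"unknown command {opcode:08b}")

# Our "first program", written directly as bytes: 01001011 00000001 00000010
program = bytes([0b01001011, 0b00000001, 0b00000010])
print(run(program))  # 3
```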
Then you can use that program to make a program that transforms simple text into binary code.
Then you can iterate on that program and make more and more complex languages, using existing programs to simplify your work.
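That text-to-binary step (an assembler) can be sketched in a few lines of Python; the mnemonic and the encoding below are made up to match the example above.

```python
# A toy "assembler": turns text like "ADD 1 2" into the made-up byte
# encoding above. The first real assemblers were written in raw machine
# code, then better ones were written using the earlier ones.
OPCODES = {"ADD": 0b01001011}  # invented, for illustration only

def assemble(line: str) -> bytes:
    mnemonic, *args = line.upper().split()
    return bytes([OPCODES[mnemonic], *map(int, args)])

print(assemble("ADD 1 2").hex())  # 4b0102
```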
To see the bits & pins docs for yourself, you can look here: [https://www.intel.com/content/www/us/en/developer/articles/technical/intel-sdm.html](https://www.intel.com/content/www/us/en/developer/articles/technical/intel-sdm.html)
Beware, it’s a long document (5k pages!) describing EVERYTHING about modern x64 CPUs and their instructions.
>Did they just rewrite the original computer code from scratch or are all modern computer languages designed to communicate with this baseline original code which all computers were programmed with?
There is no “baseline” code; each CPU (and even each OS) has its own binary code. What people do have are “compiler toolkits”, like GCC or LLVM, that let you write abstract code with modules for different CPUs and different languages. So when you make a new language, you can write the YourLanguage->LLVM generator, and LLVM will already have x64/ARM/Linux/Windows generators.
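The idea looks roughly like this (a conceptual Python sketch, not real GCC/LLVM code): the language author writes one frontend that emits a shared intermediate form, and the toolkit's existing backends handle each CPU.

```python
# Conceptual sketch of a "compiler toolkit" (not real LLVM/GCC APIs):
# one frontend per language emits a shared intermediate representation (IR),
# and one backend per CPU turns that IR into target-specific output.

def frontend(source: str) -> list[tuple[str, str, str]]:
    # Pretend the whole language is "x + y"; emit abstract IR.
    lhs, _, rhs = source.split()
    return [("add", lhs, rhs)]

def backend_x64(ir) -> str:
    return "\n".join(f"add {a}, {b}" for op, a, b in ir)

def backend_arm(ir) -> str:
    return "\n".join(f"ADD {a}, {a}, {b}" for op, a, b in ir)

ir = frontend("a + b")   # the only part a new language has to provide
print(backend_x64(ir))   # existing backends are reused for each CPU
print(backend_arm(ir))
```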
At its core a computer only does one thing: add two numbers together, really really fast. Then by combining additions you can do more interesting things. If you add the same number several times you get multiplication; if you add a negative number you get subtraction; if you add a negative to a positive you can deduce which number is larger from the result. By 1978 we’d invented lots of clever ways to add two numbers together (as well as store and retrieve them) and created the list of options, called the instruction set, that we pretty much still use today: the 8086.
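Roughly, in Python (just to show the principle; real CPUs have dedicated circuits for most of this):

```python
# Getting other operations out of nothing but addition.
def multiply(a: int, b: int) -> int:
    total = 0
    for _ in range(b):
        total += a            # add the same number b times
    return total

def subtract(a: int, b: int) -> int:
    return a + (-b)           # adding a negative is subtraction

def is_larger(a: int, b: int) -> bool:
    return a + (-b) > 0       # the sign of the result tells you which is bigger

print(multiply(6, 7), subtract(10, 3), is_larger(5, 2))  # 42 7 True
```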
Yes, very smart people wrote code like this. Later on they added layers of interpretation to make the instructions more human readable, then more layers to make them even more readable, then still more layers to make programming more efficient with human time. But even when you write in Python, it still gets compiled down to a series of the 8086 instructions (there are 117 different instructions) for the computer to understand.
The ‘original’ language is basically designed by the manufacturer of the hardware; you could say it is part of the instruction manual for the CPU! Just like you need to know how to program a clock or set up a fridge, the CPU manufacturer provides instructions (the language) for how to use the different functions of the CPU. Over time the languages have become standardized, so that programmers don’t need to learn another language for each CPU (there still are different ones, but x86 is the most popular). All other programming languages use the ‘original’ language as their basis.
It began with logic ICs. The well-known Breakout video game created by the great Steve Wozniak, for example, was built with no code inside, just 41 (or 42?) logic ICs. With boolean logic you can create mathematical operations. If you want to experiment a little, you can use programs like Dreams on PS4/PS5, Blender Game Engine 2.79, and UPBGE; you can download pocket calculators that have been built with those systems.
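As a taste of how boolean logic becomes arithmetic, here's a 1-bit adder expressed with nothing but logic operations in Python; chained copies of it are essentially what those logic ICs do in hardware.

```python
# A 1-bit full adder built purely from boolean logic, then chained
# four times to add 4-bit numbers - the same trick logic ICs do in hardware.
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    total = a ^ b ^ carry_in                    # sum bit (XORs)
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit (ANDs/OR)
    return total, carry_out

def add_4bit(x: int, y: int) -> int:
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_4bit(0b0101, 0b0011))  # 5 + 3 = 8
```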
Go to YouTube and search for Ben Eater. He builds a full working computer from the ground up, including all the software and hardware, using simple push-together ‘breadboard’ techniques, and explains it all along the way in very digestible ways. His whole series is many videos, but the first one or two are all you need.
The “original language” is designed by whoever makes the processor (or other computer chip). The chip has physical pins that attach to the circuit board. Some are inputs, which receive a signal, and some are outputs, which emit a signal. When the input pins receive a signal, the chip does something based on how it was designed. A very basic operation might be to add two numbers. For this to happen the processor will receive 3 signals: first the operation to perform (in this case add), then the numbers to add. It will perform the operation and output the result on its output pins, which either the processor itself or some other chip can then use.
All of the pins and the commands the chip can respond to are documented by the chip maker. Anyone with this information and a little electrical engineering knowledge could then wire up the chip to do something.
But writing programs in this code, called machine code, is inconvenient because it’s usually much too basic on its own. To do things like draw a circle on the screen you need to write hundreds or even millions of machine code commands. Think of it like ordering a hamburger at a restaurant. You don’t tell the waiter all the individual steps needed to prepare a hamburger (unless you are nuts). The chef knows what a “hamburger” is and translates “make a hamburger” into the thousands of sub-commands (slice tomato, put on bun, heat grill, etc.) necessary. Fortunately there are people who do that for computers as well. They build languages on top of the machine code that make it easier to perform complex actions. And still more people build languages on top of those to make it easier to do various tasks.
Each language can be automatically translated into lower level languages until ultimately it’s just a list of machine code commands.
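A rough sketch of that layering in Python, keeping the hamburger example (the commands and their expansions are invented for illustration): each level just expands into a longer list of simpler commands until only "machine-level" steps are left.

```python
# Toy picture of language layers: each high-level command expands into
# simpler commands, recursively, until only basic steps remain.
# The commands and their expansions are made up for illustration.
EXPANSIONS = {
    "make_hamburger": ["heat_grill", "cook_patty", "slice_tomato", "assemble_bun"],
    "cook_patty": ["place_patty", "wait", "flip_patty", "wait"],
}

def lower(commands: list[str]) -> list[str]:
    steps = []
    for cmd in commands:
        if cmd in EXPANSIONS:
            steps.extend(lower(EXPANSIONS[cmd]))  # keep expanding
        else:
            steps.append(cmd)                     # already a basic step
    return steps

print(lower(["make_hamburger"]))
```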
Funny enough I posted an answer earlier that had to go over this sort of thing. Now I need to fill in the gaps.
The first computers were hard-wired. You physically changed the wires around to reprogram them. I don’t think I need to explain how or why that would eventually get messy, since a hard-wired computer can only run a single program before needing to be rewired and debugged - sometimes literally, since "debugging" originally meant getting insects out of the wires and circuits so they didn’t mess up the program’s operation.
Other answers here have gone over the binary code that gets fed to modern computers, and that is the first software programming language ever used on any computer. You just needed some way to feed that information to the computer itself so it could run it - see above for the baseline of every computer past the first.
Funny enough, there’s no universal binary code, either. What runs on the 6502 won’t work on the Z80, and you’d have to account for the differences in processor code to run similar software. But it is upon machine code that every single operation your computer performs today is built, and the languages built upon languages built upon languages are such a marvel that I keep wondering why nothing breaks - well, more than it already does.
So how was it developed? Computers are made of circuits, and from a single simple gate like NAND or NOR you can create every other circuit necessary to make a computer that runs programmable code. One basic gate being the foundation of every circuit in a computer is why you can build fully functional computers in Minecraft redstone without any command blocks (a redstone torch effectively acts as a NOT/NOR gate). Look some of those up, they’re crazy.
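For the curious, here's that universality sketched in Python: with just one gate type (NAND here) you can build the others, which is roughly the hardware truth hiding under all of this.

```python
# NAND is "universal": NOT, AND and OR can all be built from it,
# so a pile of one simple gate type is enough to build a whole computer.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} {b} -> AND={and_(a, b)} OR={or_(a, b)}")
```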
All computer code is binary in its essence.
Think of a calculator: a single digit uses 7 lights (segments) to display a number visually.
Binary 0001 means 1, which means light up the diodes on the right side; 1000 means 8, which means light up all the diodes; 0000 means 0, which means light up all of them except the middle one.
In the background there are physical transistors placed in a logical arrangement, creating a logic board that lets the diodes light up in a shape our brain interprets as a 1, 0, or 8.
Behind the scenes it’s still just 1s and 0s. Everything else is just a user interface we use to interpret the code.
For example, binary 0-9: you need 4 bits, or 4 buttons. If you press down the two middle buttons, that means 6; if you press the two outer buttons, that means 9.
Now instead you have 10 buttons with a symbol drawn on each. When you press the button with a 9 drawn on it, that button also physically presses down the two outer buttons hidden from view. So to you it looks like you’re pressing one button, but the inner workings actually press down two.
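The same idea in a few lines of Python (the segment letters follow the usual a-g naming for a 7-segment display; the mapping is simplified):

```python
# A decimal digit stored as 4 bits ("which buttons are pressed"),
# plus a lookup for which of the 7 display segments to light.
# Segment names follow the common a-g convention for 7-segment displays.
SEGMENTS = {
    0: "abcdef",    # everything except the middle segment (g)
    1: "bc",        # just the two right-hand segments
    6: "acdefg",
    8: "abcdefg",   # all segments lit
    9: "abcdfg",
}

def digit_to_bits(d: int) -> str:
    return format(d, "04b")   # e.g. 9 -> "1001", the two "outer buttons"

for d in (0, 1, 6, 8, 9):
    print(d, digit_to_bits(d), "-> light up segments:", SEGMENTS[d])
```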
Trying to keep this ELI5 instead of writing basically computer science 101 in a reddit thread