What does the code that makes up programming languages look like?

Take a language like Java. How was it originally created? I can’t wrap my head around how someone invented a computer language without having some “prior” language that allows the first lines to function. Is it just Java all the way down, like someone wrote a single line of Java and then every other line was built on that?

What about the first computer language? What was the basis that that functioned on?

Thanks for any help, I hope that was phrased in a mildly intelligible way.

Edit: I’m trying to think of it like human language: at some point there was a first “word” spoken by someone and understood by another, and from there the structure started to be born. What were the first “words” on a computer that led to where we are now?

In: Technology

36 Answers

Anonymous

The very first “languages” weren’t anything you would consider to be a language. For example, punched cards with holes representing 0s and 1s. A “programmer” would write 0001001001110101010 in a very precise way for that particular computer to move data around in memory, perform simple operations on it (e.g. add, subtract), or what have you. They were literally manipulating individual bits on the hardware.

The truth is, all computers still actually work this way! Down at the level of your actual CPU, the only language it can understand is still 0010010100101010. But what has happened since is what’s called “abstraction.”

For example, someone invents another language on top of the binary that captures larger “ideas” which would otherwise be spelled out as longer strings of binary (this is essentially assembly language). Instead of 0010101010010101, the language lets you write “move 0010 to register 4” or “add the contents of registers 2 and 6.” People write in that language, and it then gets translated back into the original 00101010 for the CPU to actually run.
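To make that translation step concrete, here’s a tiny sketch (written in Java purely for illustration) of an “assembler” for an imaginary CPU. The mnemonics, opcodes, and 12-bit instruction layout are all made up, not any real machine’s; the point is just that human-readable instructions get mechanically converted into raw bit patterns.

```java
// Toy "assembler" sketch: turns made-up mnemonics into bit patterns for an
// imaginary CPU. The opcodes and layout are invented for illustration only.
import java.util.Map;

public class ToyAssembler {
    // Pretend opcodes for our imaginary machine, 4 bits each.
    private static final Map<String, Integer> OPCODES = Map.of(
            "MOVE", 0b0001,   // MOVE value -> register
            "ADD",  0b0010    // ADD register, register
    );

    // Pack one instruction into 12 bits: [opcode:4][operand A:4][operand B:4]
    static int assemble(String mnemonic, int a, int b) {
        int opcode = OPCODES.get(mnemonic);
        return (opcode << 8) | (a << 4) | b;
    }

    public static void main(String[] args) {
        // "move 4 to register 2" and "add the contents of registers 2 and 6"
        int move = assemble("MOVE", 4, 2);
        int add  = assemble("ADD", 2, 6);

        // Print the raw bits our imaginary CPU would actually execute.
        System.out.println(String.format("%12s", Integer.toBinaryString(move)).replace(' ', '0'));
        System.out.println(String.format("%12s", Integer.toBinaryString(add)).replace(' ', '0'));
    }
}
```

Running it prints 000101000010 and 001000100110: the “language” on top is just a friendlier way to spell out the bits.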

Then someone else comes along and goes “hey so, having to remember all these registers and where things are in memory is a huge pain in the ass. What if we let people just define simple variables (x = 4, y = 3, z = x + y) and then we keep track of where everything is for them?” So first we translate into “move 0010 to register 4,” and then we translate THAT into 00101010 for the CPU to actually run.
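Tying this back to Java from the original question: here’s that x/y/z example as real Java. You never mention a register or a memory address; the toolchain handles all of that for you. The comments show roughly what the standard tools print when you inspect the compiled result (the exact listing varies by compiler version), and the JVM later turns that bytecode into your CPU’s actual machine instructions.

```java
// The x = 4, y = 3, z = x + y example from above, written in Java.
public class Sum {
    public static void main(String[] args) {
        int x = 4;
        int y = 3;
        int z = x + y;
        System.out.println(z);
    }
}

// Compiling with "javac Sum.java" and inspecting with "javap -c Sum" shows
// roughly this stack-machine bytecode (the exact listing varies by version):
//
//   iconst_4      push the constant 4
//   istore_1      store it in local variable slot 1 (x)
//   iconst_3      push the constant 3
//   istore_2      store it in slot 2 (y)
//   iload_1       push x
//   iload_2       push y
//   iadd          add the two values on top of the stack
//   istore_3      store the result in slot 3 (z)
//   ...           then call System.out.println and return
//
// At run time the JVM interprets or JIT-compiles this bytecode into the raw
// machine instructions your particular CPU understands.
```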

Keep doing this 30-40 times over half a century, and you get to the modern languages we use today. But realize *nothing* *can actually run these languages as written*. They have no meaning to even a modern CPU. They have to be lexed, parsed, compiled, and assembled all the way down (through multiple intermediate steps) back into the same 00001010101010 that people were punching onto cards 60 years ago.
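To give a rough idea of what the first of those steps looks like, here’s a bare-bones Java sketch of a lexer that chops a line like z = x + y into tokens. Real compilers are vastly more involved; this is only meant to show that your source file is just text until a chain of tools breaks it apart and translates it downward.

```java
// Bare-bones lexer sketch: splits a statement like "z = x + y" into tokens,
// the first step a compiler takes before parsing and code generation.
import java.util.ArrayList;
import java.util.List;

public class ToyLexer {
    static List<String> lex(String source) {
        List<String> tokens = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (char c : source.toCharArray()) {
            if (Character.isLetterOrDigit(c)) {
                current.append(c);              // build up a name or a number
            } else {
                if (current.length() > 0) {     // flush the token we were building
                    tokens.add(current.toString());
                    current.setLength(0);
                }
                if (!Character.isWhitespace(c)) {
                    tokens.add(String.valueOf(c)); // single-character operators like = and +
                }
            }
        }
        if (current.length() > 0) {
            tokens.add(current.toString());
        }
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(lex("z = x + y")); // prints [z, =, x, +, y]
    }
}
```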

Modern languages are really just window dressing. No matter what you write in, it all gets compiled/baked down into the same 0s and 1s that your particular CPU needs, because that’s all it can run. Languages are just layers of shortcuts and decorative fluff that we’ve built up on top. And most of the arguments over modern languages are really about the tradeoffs of one shortcut or cosmetic change versus another. The CPU doesn’t care.
