What does the code that makes up programming languages look like?

Take a language like Java. How was it originally created? I can’t wrap my head around how someone invented a computer language without having some “prior” language that allows the first lines to function. Is it just Java all the way down, like someone wrote a single line of Java and then every other line was built on that?

What about the first computer language? What was the basis that that functioned on?

Thanks for any help, I hope that was phrased in a mildly intelligible way.

Edit: I’m trying to think of it like human language: at some point there was a first “word” spoken by someone and understood by another, and from there the structure started to be born. What were the first “words” on a computer that led to where we are now?

In: Technology

36 Answers

Anonymous 0 Comments

Imagine you had a box full of Legos, but they’re all the flat 1×2 pieces. You could probably build anything you wanted, but it’s going to be a ton of work. That’s like the simplest, earliest computer languages, and represents the language the computer actually “understands”.

Newer programming languages are like the bigger, more complex Lego pieces. Instead of building a wall from scratch, you can just use a ready-made wall piece or engine piece. They’re just bigger Lego pieces that let you build stuff faster, but at the end of the day, it all boils down to the smaller pieces that the computer actually understands.
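
As a rough sketch of that idea in plain Java (just standard library calls, nothing specific to this answer): one “big piece” from the library versus the same thing assembled by hand out of “flat pieces”.

```java
import java.util.List;

public class LegoPieces {
    public static void main(String[] args) {
        List<String> parts = List.of("red", "green", "blue");

        // The "big piece": one ready-made call does the whole job.
        String joined = String.join(", ", parts);

        // The "flat pieces": the same result assembled step by step.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < parts.size(); i++) {
            if (i > 0) {
                sb.append(", ");
            }
            sb.append(parts.get(i));
        }
        String joinedByHand = sb.toString();

        System.out.println(joined);        // red, green, blue
        System.out.println(joinedByHand);  // red, green, blue
    }
}
```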

Anonymous 0 Comments

Modern languages are what’s called high-level languages, which means they’re fairly understandable by us humans. In order to run, a piece of code must first be compiled.
Most code today is compiled by a compiler written in that same language, but the first compiler for a language has to be built in a language that came before it… all the way back to basically coding in raw binary or with punch cards. You make a program in raw binary, at much expense and effort, that can read higher-level code and compile it down into the simpler code only the machine can understand. You use that for a while, then build a new compiler in your new language, compile it with the old one, and now you don’t need the old language at all. It’s a bit of a cycle.
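
To make “a program that reads higher-level code and compiles it down into simpler code” a bit more concrete, here’s a toy sketch in Java. The tiny input language and the PUSH/ADD output form are made up purely for illustration; a real compiler does the same kind of translation, just with far more rules and far more care.

```java
import java.util.List;

// A toy "compiler": translates a made-up one-line language
// (e.g. "add 2 3") into a made-up lower-level stack form
// ("PUSH 2", "PUSH 3", "ADD").
public class ToyCompiler {
    static List<String> compile(String line) {
        String[] tokens = line.trim().split("\\s+");
        String op = tokens[0].toUpperCase();   // e.g. ADD, SUB
        return List.of("PUSH " + tokens[1],
                       "PUSH " + tokens[2],
                       op);
    }

    public static void main(String[] args) {
        for (String source : List.of("add 2 3", "sub 10 4")) {
            System.out.println(source + "  ->  " + compile(source));
        }
    }
}
```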

Anonymous 0 Comments

Alan Turing, the father of modern computing, used a giant machine with cogs to decrypt Nazi messages in WWII. With the aid of mathematicians, he would take very large numbers and “encode” them into cogs, then run the machine and check the result. The cogs were arranged in such a way that, as they moved, they would affect each other in a way that could be seen as “calculating”. The result would help them discover what the encrypted messages could mean.

Try to look at languages the same way as you look at the cogs. We “encode” some meaning onto the machines, and from this we create language. No different to how each word you are reading right now is really an encoding of symbols that represent sounds of a language you understand.

We simply make a machine that does an operation and then we say “let that operation, represent this thing”.

Java came decades after Alan Turing’s Bombe machine, and Turing himself came decades after the ideas that sprouted computer science, particularly set theory. Algorithms as an idea have also been around for centuries, so it’s hard to pick a point and say “it started here”. Many computer science ideas are taken from maths.

Anonymous 0 Comments

First there was binary. Then came assembly, which is built directly on top of binary: a set of extremely elementary instructions for working with memory and values (stores, loads, jumps to different lines, setting flags, addition). All other languages then stem from assembly, or from languages built on top of it.
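
For a feel of how elementary those instructions are, here is one line of Java with, in the comments, roughly the bytecode instructions the compiler turns it into (the exact output of `javac`, viewed with `javap -c`, can differ slightly, so treat the comments as an approximation):

```java
public class AddExample {
    static int add(int a, int b) {
        int c = a + b;   // roughly: iload_0  (load a),
                         //          iload_1  (load b),
                         //          iadd     (add them),
                         //          istore_2 (store the result in c)
        return c;        // roughly: iload_2, ireturn
    }

    public static void main(String[] args) {
        System.out.println(add(2, 3)); // prints 5
    }
}
```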

Anonymous 0 Comments

There is a field called theoretical computer science which contains a subfield of formal languages & automata ([https://en.wikipedia.org/wiki/Formal_grammar](https://en.wikipedia.org/wiki/Formal_grammar)). People in this field (around the 1950s, notably Chomsky) tried to come up with grammars, built from a fixed set of symbols, that could only be interpreted in one way (without needing any context).
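
As a tiny example, the classic grammar for arithmetic expressions is written so that every valid expression has exactly one parse (which is why `1 + 2 * 3` groups the multiplication first):

```latex
\begin{align*}
E &\to E + T \mid T \\
T &\to T * F \mid F \\
F &\to ( E ) \mid \text{number}
\end{align*}
```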

Anonymous 0 Comments

At the bottom, the code is assembly, where you manually instruct the computer to set values, move values, and perform mathematical operations on those values. The values are 1s and 0s, or sets of those. Everything that a computer does is represented by these values. However, programmers created layers of code to abstract away from this hard stuff. They created simple commands that each perform dozens of assembly actions at once. Over the years, it got easier to write code as the layers were added, perfected, and standardized.

Programming languages are higher layers that interact with lower layers in different ways. Some are more efficient at performing tasks. Some are easier to understand. Some make code that can be reused more effectively and shared. Which one is best is subjective.

In the end, it is all about talking to the various parts of a computer and moving values from point A to point B. Everything else is about aggregation of combinations of those actions.
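
A small plain-Java illustration of “moving values from point A to point B”, and of a higher layer standing in for many small moves (nothing here is specific to this answer):

```java
import java.util.Arrays;

public class MovingValues {
    public static void main(String[] args) {
        int[] source = {1, 2, 3, 4};
        int[] destA = new int[4];
        int[] destB = new int[4];

        // Higher layer: one call that copies the whole block of values.
        System.arraycopy(source, 0, destA, 0, source.length);

        // Lower layer (conceptually): move each value one at a time.
        for (int i = 0; i < source.length; i++) {
            destB[i] = source[i];
        }

        System.out.println(Arrays.toString(destA)); // [1, 2, 3, 4]
        System.out.println(Arrays.toString(destB)); // [1, 2, 3, 4]
    }
}
```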

Anonymous 0 Comments

Lots of them are written in C, or (like java) run in VMs written in C or C++. The C/C++ compilers (GCC, Clang, etc) are also written in C and C++ (i.e., self-bootstrapping.) The early C compilers were bootstrapped with a minimal version in a hardware-dependent assembly language that could compile the C version. C is still heavily used.

Anonymous 0 Comments

It’s a chain.

The first program was very basic, and could only understand on and off. But a human could use it in a clever way to make a second program which could understand a set of different on and off settings in a pattern, like when you see someone hold up two fingers instead of one, and that means something.

The next program after that could understand where one pattern like that ended and where another began. At this point, it’s like words, only more basic – just lots of “on” and “off.” But see, here it gets useful, because these are like symbols that the computer understands. It can move electricity in patterns defined by these sets of on-off.

At this point programs were just sets of cards with holes in them that a computer would read. There were no monitors or keyboards.

Well, that program was used to make a new program that not only understood these patterns, but could show a human on a screen letters and numbers that represented those patterns. And with those letters and numbers – stuff like “PUSH 1” and so on – it was a lot easier to write more complicated programs.

This is where it really took off. Now people were telling the computer how to understand even more words and turn those into long, long sets of on-off that it could use to direct electricity in _really complicated_ ways.

These “compilers” are what form the basis of all programming languages today. Programs that teach computers how to understand other programs.

Sorry if that was too long.

Anonymous 0 Comments

You don’t need a computer to create a programming language. A programming language is at its most basic just a set of rules defining the syntax and behaviour of the code.

What you need the computer for is to write the compiler (and/or, in the case of an interpreted language, the interpreter/runtime environment) and the libraries/API. Once you have the compiler, the libraries *can* be written in the same language, but often they are written in another language (usually a lower-level language, or even partly assembly).

Compilers/Interpreters are very often written in the same language they compile (self-hosting), but obviously that’s a cyclic dependency – the first/earliest versions would have been written in prior existing programming languages.

Taking Java as an example: the official Java API is mostly written in Java, with parts of it in C or C++. The official compiler (Javac), which converts Java source code to bytecode, is today written in Java but was originally written in C and C++. The official Java Virtual Machine (JVM), which runs the bytecode on the various platforms that Java supports, is as far as I know written mostly in C++ and C. Note that I keep saying official – anyone is able to write their own version. Oracle’s licensing shenanigans aside, nothing stops you from writing a JVM in C#, or a Java compiler in JavaScript.
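
To see that pipeline concretely with the standard tools: `javac` turns the source into bytecode in a `.class` file, and the `java` command starts a JVM that runs the bytecode.

```java
// Hello.java
public class Hello {
    public static void main(String[] args) {
        // javac Hello.java   -> produces Hello.class (the bytecode)
        // java Hello         -> the JVM loads and runs that bytecode
        System.out.println("Hello from bytecode!");
    }
}
```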

C++ is a special case; the first C++ compiler was written in C++ itself, but my understanding is there were some ugly hacks and conversions involved in actually getting it compiled by a C compiler. C itself evolved from an earlier language called BCPL (by way of B), and the first C compilers grew out of the tools for those languages. Go up the chain far enough, and eventually you reach someone who was programming a computer by writing assembly language, pulling wires, flipping switches, or punching cards.