It depends on your view of what a computer and a program are. Here’s a list of things people generally consider to be machines running programs:
1. As early as the 9th century, a programmable music sequencer was invented by the Persian Banu Musa brothers.
2. In 1206, the Arab engineer Al-Jazari invented a programmable drum machine where a musical mechanical automaton could be made to play different rhythms and drum patterns.
3. In 1801, the Jacquard loom could produce entirely different weaves by changing the “program” – a series of pasteboard cards with holes punched in them.
The first computer program is generally dated to 1843, when mathematician Ada Lovelace published an algorithm to calculate a sequence of Bernoulli numbers, intended to be carried out by Charles Babbage’s Analytical Engine. However, Babbage himself wrote his first program for the Analytical Engine in 1837.
In the 1880s, Herman Hollerith invented the concept of storing *data* in machine-readable form. Later, a control panel (plug board) added to his 1906 Type I Tabulator allowed it to be programmed for different jobs, and by the late 1940s unit record equipment such as the IBM 602 and IBM 604 was programmed by control panels in a similar way, as were the first electronic computers. However, with the stored-program computer, introduced in 1949, both programs and data were stored and manipulated in the same way in computer memory.
A lot of comments address binary and logic gates, but your question is actually more subtle. At a fundamental level you can think of a computer as being all about doing math and manipulating numbers. But in order to “know” what the numbers mean, we have specialized circuits. For example, a speaker (really a DAC, a digital-to-analog converter) interprets some numbers as sound. Your monitor is a circuit that interprets numbers as images. A car has circuits that interpret numbers as commands to accelerate or brake. Conversely, your mouse is a circuit that interprets movement as numbers, and your keyboard interprets key presses as numbers. So you can think of the computer as the brains that takes all the incoming numbers and has the logic to convert them to outgoing numbers.
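Here’s a toy illustration of that idea in Python (my own example, not how any specific device works): the same four bytes, given two different interpretations. The bytes don’t “know” which one they are; the interpreting circuit (or code) decides.

```python
# The same raw bytes, given two different interpretations.
data = bytes([72, 105, 33, 33])

# Interpreted as text (roughly what a terminal "circuit" does):
print(data.decode("ascii"))          # Hi!!

# Interpreted as a single 32-bit number (big-endian):
print(int.from_bytes(data, "big"))   # 1214849313
```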
Let’s say you have a wire, and you can control the voltage on it. I can arrange with you to zap me with between 1 and 2 volts when you mean to say YES, and between 5 and 6 volts when you mean to say NO.
With that agreement, you effectively have what we call 1s and 0s in a controlled setting: a 1 is anything between 1 and 2 V, and a 0 is anything between 5 and 6 V. From there you can build logic gates; for example, one that receives two inputs, evaluates them, and produces a result. Boom! Your first calculator is here.
I made up the numbers to make it easier to understand btw.
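A minimal sketch of that arrangement in Python, using the made-up voltage ranges above (the thresholds and function names are just for illustration):

```python
def decode(voltage):
    """Turn a voltage into a bit, using the made-up ranges above."""
    if 1.0 <= voltage <= 2.0:
        return 1
    if 5.0 <= voltage <= 6.0:
        return 0
    raise ValueError(f"voltage {voltage} V is outside both agreed ranges")

def and_gate(a_volts, b_volts):
    """A logic gate: output 1 only if both inputs decode to 1."""
    return decode(a_volts) & decode(b_volts)

print(and_gate(1.5, 1.8))  # both lines say YES -> 1
print(and_gate(1.5, 5.5))  # one line says NO  -> 0
```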
Heyyyyyo.
Whenever anyone asks this, I direct them to Harvard’s brilliant online CS50 YouTube channel.
The professor is crazy good at explaining it. He does the same series of lectures every year and throws them up on YouTube, and they’re all fantastic.
[https://www.youtube.com/live/2WtPyqwTLKM?si=A6E2Vkh1vIKdq1zb&t=1494](https://www.youtube.com/live/2WtPyqwTLKM?si=A6E2Vkh1vIKdq1zb&t=1494)
Anything can be made into a computer, once you understand unary, binary, and other number bases… (there’s a small sketch after the link below)
Like:
[https://youtu.be/IxXaizglscw?si=ztZ40vAiCO7zriC0](https://youtu.be/IxXaizglscw?si=ztZ40vAiCO7zriC0)
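If you want to poke at number bases yourself, here’s a tiny sketch (my own toy code, not from the video) that writes the same value in unary, binary, and decimal:

```python
def to_base(n, base):
    """Write a non-negative integer n in the given base (2..10)."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % base))  # least-significant digit first
        n //= base
    return "".join(reversed(digits))

n = 13
print("unary:  ", "1" * n)         # 1111111111111
print("binary: ", to_base(n, 2))   # 1101
print("decimal:", to_base(n, 10))  # 13
```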
THEN.
Once you understand logic gates…
You can try playing the games Shapez and Shapez 2 (which came out last month and I’ve been horribly addicted to).
And you can literally build a computer IN THE GAME to solve the game.
For example:
[https://youtu.be/Pn4OPFuUgfs?si=xA_fy5a2fuW_Bkl1](https://youtu.be/Pn4OPFuUgfs?si=xA_fy5a2fuW_Bkl1)
There’s also the Minecraft computer that someone spent an unfathomable amount of time creating…
[https://youtu.be/dV_lf1kyV9M?si=fx7Qi0EBopaTZYKd](https://youtu.be/dV_lf1kyV9M?si=fx7Qi0EBopaTZYKd)
Then you can create Minecraft within Minecraft… If you wanted to… For some reason…
[https://youtu.be/-BP7DhHTU-I?si=sOJDeijoS9rvXipz](https://youtu.be/-BP7DhHTU-I?si=sOJDeijoS9rvXipz)
The first computer I saw was programmed by a set of manual toggle switches.
There were eight toggle switches, each representing one bit. At the start of the day, the process went as follows.
Flip the eight switches up/down according to a written instruction. Hit the ‘submit’ button. Then repeat for the next byte.
This would create a ‘boot loader’ by putting instructions into the first few dozen memory locations. When the ‘run’ button was pressed, it would load the main program from magnetic tape.
It was impressive seeing how quickly the operators could do all of this from memory!
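For the curious, here’s a rough Python sketch of what that ritual amounts to: each press of ‘submit’ deposits one hand-entered byte into the next memory location. The specific switch patterns below are invented for illustration; the real boot loader’s bytes depended on the machine.

```python
memory = [0] * 256  # tiny pretend memory

# Each entry is the eight toggle switches for one byte, copied from a
# written sheet (these particular values are made up).
switch_sheet = [
    "10110000",
    "00000001",
    "11101010",
]

for address, switches in enumerate(switch_sheet):
    memory[address] = int(switches, 2)  # pressing 'submit'

print([f"{b:08b}" for b in memory[:3]])
```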
>How was the first computer programmed?
It really depends on what you mean by “computer”. In my opinion, the most basic definition of a computer is a device that does something based on a certain input. Computers, at their most basic level, are just performing a whole bunch of simple logic evaluations in rapid succession. We define these logic evaluations in a standardized way we call “Boolean Logic”.
For example, ‘AND’ is a common Boolean Logic operation that can be expressed in English as “IF INPUT ‘A’ IS TRUE, *AND* INPUT ‘B’ IS TRUE, THEN OUTPUT ‘Z’ IS TRUE”. Notice the English sentence describes everything you need to know. What if input A is not true? Well, the statement tells you you need both inputs to be true before the output is true, so in this case the output is false. This is a trivial example, but it really is the most basic level of computation, and *everything* in modern computing is built upon simple evaluations like that.
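To make that concrete, here’s the same AND statement as a few lines of Python, printing the full truth table:

```python
def AND(a, b):
    """Output Z is true only when input A and input B are both true."""
    return a and b

for a in (False, True):
    for b in (False, True):
        print(f"A={a!s:5} B={b!s:5} -> Z={AND(a, b)}")
```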
In the early days, these logic operations were performed by intricate combinations of switches and relays, as early as 1886. Nowadays the technology is implemented on integrated circuits, but the function is largely the same. If you have a single chip with a single AND gate (the individual logic elements in a chip are referred to as gates), then you have a computer that’s “programmed” to perform a single calculation very well. So the first “computer with a program” was simply the first logic gate.
Of course, such a computer can’t have its program changed. A logic AND gate will always perform an AND operation, and that’s all it will ever do. Developers figured this out early on and realized they needed a way to specify what inputs they wanted to apply to what logic gates, and what they wanted to do with the outputs, often feeding them into other logic gates to perform further operations. This is probably the first time we see what we would recognize as “programming”. It was a major shift away from organizing circuits to do one specific thing very well with little human input, towards creating a flexible system of circuits that a human could tell exactly what to do and where to send its outputs. This was usually accomplished with punch cards.

Ada Lovelace is credited with the first written computer program, but unfortunately she died before her ideas came to fruition, so they remained just ideas. How you define a computer determines when you would say the first computer to use punch cards came about. Special-purpose machines were created all through the 1800s, but by the 1920s they were more focused on basic math and less on special purposes, and personally I think that is a very big shift in computing. With this shift came the obvious and definite need to specify your inputs and decide exactly what you wanted to do with them, and voilà, computer programming was born.
Nowadays, there are several layers in between you and the machine that make you feel like what you are doing is more complicated than that, but it’s all a ruse. At its base level, all computer programming today is just a way to convert human language and concepts into a set of operations a computer can perform with basic logic evaluations, and to turn the results back into useful output.
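As a tiny taste of how “everything is built upon simple evaluations like that”: a half adder adds two one-bit numbers using nothing but an XOR gate and an AND gate. Chain enough of these together and you can add numbers of any size, which is roughly where real arithmetic circuits start. (A sketch of the general technique, not any particular chip’s design.)

```python
def half_adder(a, b):
    """Add two bits using only logic gates: XOR gives the sum bit,
    AND gives the carry bit."""
    sum_bit = a ^ b    # XOR gate
    carry = a & b      # AND gate
    return sum_bit, carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```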