Computers work with a stack of languages, and 0s and 1s are at the lowest level. Above that there are a number of higher-level languages, but each one ultimately tells the layer beneath it what to do until, yes, you finally get down to the 0s and 1s.
The reason it’s not other numbers is that 0 means off and 1 means on. Things literally turn on and off. It’s a bit hard to have something be neither on nor off… that’s the realm of quantum computers, which are being worked on but aren’t yet stable enough for practical use.
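As a rough illustration of those layers (the instruction encoding below assumes an x86 processor; the Python is only there to print the bit patterns), the same single step can be viewed at several levels:

```python
# One "add two numbers" step, viewed at three layers (x86 assumed):
#   C source:      a = a + b;
#   assembly:      add eax, ebx
#   machine code:  the two bytes 0x01 0xD8
machine_code = bytes([0x01, 0xD8])
for byte in machine_code:
    print(f"{byte:08b}")  # the 0s and 1s the CPU actually sees
```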
Sure you could. Why would you though?
Binary can represent anything any other number system can represent. It’s also waaaaay easier to have only two “digits” or states to deal with.
Current computers only really need to distinguish between “0” (no electricity) and “1” (electricity). If you start working with more complicated number systems, you end up having to measure the actual value of the electricity running through your system, rather than just if it’s on or off.
Even more, computers don’t deal with numbers as much as they do logic. In logic, you only need two values, “True” and “False”. Adding extra values, or “digits”, to your computer is redundant for a lot of what the computer is trying to do: follow the logic its designers built it for.
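To make that concrete, here’s a toy sketch (a standard textbook half-adder, not any particular machine’s circuitry) showing how pure True/False logic is enough to do binary arithmetic:

```python
def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two 1-bit values using only logic operations."""
    carry = a and b   # AND gate: the carry bit
    total = a != b    # XOR gate: the sum bit
    return carry, total

# 1 + 1 = 10 in binary: carry is True, sum bit is False
print(half_adder(True, True))  # (True, False)
```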
I don’t like saying “computers only understand 0s and 1s” because it’s technically not true: computers don’t understand that either. They just respond to different voltage signals. Anything below a certain voltage is treated by the internal circuitry as a “low” or “off” value, and anything above another threshold is “high” or “on”.
Since they can only distinguish two values, the most logical thing to implement is binary. We do that by creating logic that treats “off” and “on” as the binary digits 0 and 1, and that performs operations on binary numbers represented as voltage levels. But again, at no point does the computer know anything; it’s just wired by us to treat voltages the way we treat the digits 0 and 1.
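A minimal sketch of that thresholding idea (the voltage levels here are made up; real chips define specific “low” and “high” bands with a forbidden region in between):

```python
LOW_MAX = 0.8    # at or below this counts as 0 (assumed level)
HIGH_MIN = 2.0   # at or above this counts as 1 (assumed level)

def to_bit(voltage: float) -> int:
    """Interpret an analog voltage as a binary digit."""
    if voltage <= LOW_MAX:
        return 0
    if voltage >= HIGH_MIN:
        return 1
    raise ValueError("undefined region: neither low nor high")

print([to_bit(v) for v in (0.3, 3.1, 0.0, 2.8)])  # [0, 1, 0, 1]
```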
There are two answers:
1. Analog computers exist. They just aren’t particularly common.
2. You could build a computer that uses some other number base; it would just be more complicated. Building a physical circuit that takes one of only 2 states (logically interpreted as 0 or 1) is relatively simple (e.g. a semiconductor that only has to switch between two energy levels). But doing the same thing to distinguish 3 states (0, 1, or 2) requires more complex circuitry. This problem is amplified when using older components (e.g. relays or vacuum tubes) to implement distinguishable states. The math, as the sketch below shows, is the easy part.
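Mathematically, nothing stops you from using another base. A quick sketch (any base would do; base 3 here is just an example):

```python
def to_base(n: int, base: int) -> list[int]:
    """Express a non-negative integer as digits in an arbitrary base."""
    if n == 0:
        return [0]
    digits = []
    while n:
        n, d = divmod(n, base)
        digits.append(d)
    return digits[::-1]  # most significant digit first

print(to_base(42, 2))  # [1, 0, 1, 0, 1, 0] -> binary
print(to_base(42, 3))  # [1, 1, 2, 0]       -> ternary, same value
```

The hard part is building hardware that can hold and distinguish those extra states reliably, not the number system itself.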
You could use any arbitrary range of numbers/values if you wanted. It just gets significantly more difficult.
0 and 1 are essentially off and on, or low and high.
In computers this, at its most basic level, is usually a voltage. Say 5V for high and 0V for low. Very easy.
But small variations might mean it’s actually 4.2V and 0.3V. No big deal; that’s close enough.
If you had three options, then maybe it would be 0, 2.5 and 5. That leaves less room for error, and small variations could feasibly flip a value (see the quick simulation below).
For technical reasons it’s also much harder to reliably produce a voltage in between than to just give the full whack or nothing.
So 0 and 1 are generally the easiest and most reliable option.
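A quick simulation of that tradeoff (the voltage levels and noise spread are made-up illustration values): the same noise that binary shrugs off starts flipping values once three levels are packed into the same 0–5V range.

```python
import random

random.seed(1)

def decode(voltage: float, levels: list[float]) -> int:
    """Snap a noisy voltage to the nearest nominal level."""
    return min(range(len(levels)), key=lambda i: abs(voltage - levels[i]))

def error_rate(levels: list[float], noise: float, trials: int = 10_000) -> float:
    """Fraction of symbols misread after adding Gaussian noise."""
    errors = 0
    for _ in range(trials):
        sent = random.randrange(len(levels))
        received = levels[sent] + random.gauss(0, noise)
        errors += decode(received, levels) != sent
    return errors / trials

print(error_rate([0.0, 5.0], noise=1.0))       # two levels: errors are rare
print(error_rate([0.0, 2.5, 5.0], noise=1.0))  # three levels: far more errors
```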
We definitely can make computers that use more. The way normal computers work is that they treat a “high” voltage as 1 and a “low” voltage as 0. We define some cut-off value, and then we just make sure all the voltages are comfortably above or below that to avoid mistakes. We’re actually treating an analogue signal as a binary one. In other words, the voltage could take any value between actual 0 and some maximum, but we just care about high or low because it’s simpler.
The original design for a computer, Charles Babbage’s Analytical Engine, actually worked as a decimal computer, using the digits 0 through 9. But it never got made. Then there are [analogue computers](https://youtu.be/IgF3OX8nT0w), which go even further.