Why do computers work in base 2, as opposed to base (higher number here)?

I realise (/think?) that CPUs essentially treat two different voltages as a 1 or 0, but what stops us from using 3 or more different voltages? Wouldn’t that exponentially increase the CPU’s throughput by allowing for decisions with greater than two outcomes to be calculated in one cycle? This would presumably mean that a LOT of stuff written for base 2 would need to be updated to base 3 (in this example), but I can’t imagine that’s the only reason we haven’t done this.

I feel like I’ve explained that poorly, but hopefully you get the gist.

Anonymous 0 Comments

The reality is that computers don’t really work in base 2…

Each computer has a specific instruction set, which is the set of operations it knows how to perform. Exactly which instructions it has depends on the specific computer, but it will generally have instructions like:

add a, b

which means add the numbers a and b together. For a modern computer, the values a and b are numbers of a specific size, that is, a specific number of bits. They might be 8, 16, 32, or 64 bits.

So if they are 8 bits, each number can be from 0 to 255.
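
To make that fixed size concrete, here is a small C sketch (my own illustration, not something from the original answer) using uint8_t, an 8-bit unsigned type: it can hold 0 through 255, and anything bigger wraps around.

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t a = 255;      /* the largest value 8 bits can hold */
    uint8_t b = a + 1;    /* 256 does not fit in 8 bits, so it wraps around to 0 */

    printf("a = %u, b = %u\n", (unsigned)a, (unsigned)b);   /* prints: a = 255, b = 0 */
    return 0;
}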

The computer's processor has an arithmetic unit that does the actual addition, but it does the addition all at once, not one bit at a time.

From a programmer's perspective, you could write something like:

add 15, 3

add 00001111, 00000011

add 0x0F, 0x03

Those are just three ways of writing the same numbers: in decimal, in binary, and in hexadecimal.
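
As a rough sketch of that idea (my own example, not from the original answer), the following C program writes the value fifteen in all three bases and adds 3 to each; note that the 0b binary prefix is standard only as of C23, though most common compilers accept it as an extension:

#include <stdio.h>

int main(void) {
    int dec = 15;           /* base 10 */
    int hex = 0x0F;         /* base 16 */
    int bin = 0b00001111;   /* base 2 (C23, widely supported as an extension) */

    /* All three spellings become the exact same value, so the additions give the same result */
    printf("%d %d %d\n", dec + 3, hex + 3, bin + 3);   /* prints: 18 18 18 */
    return 0;
}

Whatever base the literal is written in, by the time it reaches the CPU it is just the same pattern of bits, and the add instruction works on the whole value at once.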
