Why would you make it 1000 when you can make it an even 1024?
Computers work with nothing but switches that can be on or off. A single switch gives you 2 possibilities, 2 switches give you 4, 3 switches give you 8 possible combinations, and so on through the powers of 2. 1024 is a power of 2: it's the number of possible values you can represent with 10 switches.
Anything that is a power of 2 is much easier for computers to work with, just like powers of 10 are easier for humans, who use a system based on "10". Computers are based on "2", so 1024 is the "even" number that actually makes sense.
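If you want to see that doubling concretely, here's a quick Python sketch (just my own illustration of the idea above, not anything official):

```python
# Each on/off switch doubles the number of representable values:
# n switches give 2**n combinations, so 10 switches give 1024.
for n_switches in range(1, 11):
    print(f"{n_switches:2d} switches -> {2 ** n_switches:4d} combinations")
# The last line printed is: "10 switches -> 1024 combinations"
```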
True geek answer – there really are 1,000,000,000 bytes, or 1,000 megabytes, in a gigabyte. At least there are these days. The older base-2 usage, in which a gigabyte was 1,024 megabytes, was renamed years ago: that binary unit is now the gibibyte, and there are 1,024 mebibytes in a gibibyte.
The 1,024 thing is as others have already explained, so no need for me to repeat it.
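A couple of lines of Python show how far apart the two definitions are (the numbers follow directly from the definitions above; the variable names are just for illustration):

```python
# SI (decimal) gigabyte vs. IEC (binary) gibibyte, per the definitions above.
gigabyte = 1000 ** 3   # 1 GB  = 1,000,000,000 bytes
gibibyte = 1024 ** 3   # 1 GiB = 1,073,741,824 bytes

print(gibibyte - gigabyte)           # 73741824 bytes of difference
print(f"{gigabyte / gibibyte:.3f}")  # 0.931 -- a "1 GB" drive is ~0.93 GiB
```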
I want to express this a bit more simply than the other posters who used powers of…
Start with one switch: this gives two options (on or off).
Add a second switch and now you have four options (on/on, off/off, on/off, off/on).
If you keep adding switches, each time you double the number of options, so the sequence goes:
2, 4, 8, 16, 32, 64, 128, 256, 512, 1024
Which is why we have 1024 megabytes, not 1000 – it's difficult to get exactly 1000 options using just switches, but it's easy to get 1024.
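As a small sketch of that "keep adding switches" doubling (again, just an illustration of the reasoning above):

```python
# Reproduce the sequence above: every added switch doubles the options.
options = 1
for switches_added in range(1, 11):
    options *= 2  # one more on/off switch doubles the combination count
    print(options, end=" ")
# Prints: 2 4 8 16 32 64 128 256 512 1024
```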