– why are there 1024 megabytes in a gigabyte? Why didn’t they make it an even 1000?


22 Answers

Anonymous 0 Comments

Computers are, at base, a bunch of switches that can be on or off.

If you have one switch you have two options: 0 (off) or 1 (on).

If you have two switches you have four (00, 01, 10, 11).

As such, powers of 2 come up a lot and 2^10 = 1024.
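To make that concrete, here is a minimal Python sketch (purely illustrative) that enumerates the on/off combinations for a few switch counts:

```python
from itertools import product

# Count how many distinct values n on/off switches (bits) can represent.
for n in (1, 2, 3, 10):
    states = list(product("01", repeat=n))  # every on/off combination
    print(f"{n} switches -> {len(states)} values")  # 2, 4, 8, 1024
```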

Anonymous 0 Comments

Why would you make it 1000 when you can make it an even 1024?

Computers work with nothing but switches that can be on or off. A single switch gives you 2 possibilities, 2 switches give you 4 possibilities, 3 switches give you 8 possible combinations, and so on through powers of 2. 1024 is a power of 2: it’s the number of possible values you can represent with 10 switches.

Anything that is a power of 2 is much easier for computers to work with, just like powers of 10 are easier for humans, who work with a system based on “10”. Computers are based on “2”, hence 1024 is the actual even number that makes sense.
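As a quick illustrative check (plain Python, nothing assumed beyond the arithmetic), the “round” numbers of each base line up like this:

```python
print(10 ** 3)   # 1000 -- a round number for humans (base 10)
print(2 ** 10)   # 1024 -- a round number for computers (base 2)
print(1 << 10)   # 1024 -- the same power of 2 written as a bit shift
```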

Anonymous 0 Comments

On top of what others explained, there’s a base-10 unit system and a base-2 unit system; the standard kilo/mega/giga/tera prefixes are the base-10 system.

The 1,024 applies to the kibibyte, mebibyte, gibibyte, tebibyte, etc., which is the binary byte system (for example, kibibyte stands for “kilo binary byte”).
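A small sketch, assuming the standard SI (decimal) and IEC (binary) prefix definitions; the dictionaries below are just illustrative names:

```python
SI = {"kB": 10**3, "MB": 10**6, "GB": 10**9}      # decimal prefixes
IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30}  # binary prefixes

print(SI["GB"] // SI["MB"])      # 1000 megabytes per gigabyte
print(IEC["GiB"] // IEC["MiB"])  # 1024 mebibytes per gibibyte
```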

Anonymous 0 Comments

1000 is a nice even number in base-10. It is 10^3.

1024 is a nice even number in base-2. It is 2^10.

Computers work in base-2.

Anonymous 0 Comments

In base 10, which most people use, 1000 is a nice even number. But 1024 is nice and even in base 2 (the binary that computers use). In binary it’s 10000000000.

Anonymous 0 Comments

Computers work in binary (base 2), not decimal (base 10). 1024 is a power of 2; 1000 is not. So in the case of computers, or any base-2 system, 1024 is the even number and 1000 is the oddball.

Anonymous 0 Comments

Computers work in binary. 10000000000 (1024) is a nice number in binary, 1111101000 (1000) isn’t.
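You can verify those binary spellings yourself; this is just an illustrative snippet using Python’s built-in bin():

```python
print(bin(1024))  # 0b10000000000 -- a 1 followed by ten zeros, "round" in binary
print(bin(1000))  # 0b1111101000  -- messy in binary
```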

Anonymous 0 Comments

This is slightly not what was asked, but there are actually 1000 megabytes in a gigabyte. There are, however, 1024 mebibytes in a gibibyte, but the standard 1000x multiplier names are very often incorrectly used in place of those.
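To put a number on the difference (the 500 GB figure below is just a made-up example), the same byte count looks smaller when expressed in the binary units:

```python
advertised = 500 * 10**9   # hypothetical "500 GB" expressed in bytes
print(advertised / 2**30)  # ~465.66 GiB -- same bytes, smaller-looking number
print(2**30 / 10**9)       # 1.073741824 -- a GiB is about 7.4% larger than a GB
```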

Anonymous 0 Comments

True geek answer – there really are 1,000,000,000 bytes, or 1,000 megabytes, in a gigabyte. At least there are these days. The older base-2 usage, in which a gigabyte meant 1,024 megabytes, was replaced years ago by the gibibyte. There are 1,024 mebibytes in a gibibyte.

The 1,024 thing works like others have already explained, so no need for me to repeat it.

Anonymous 0 Comments

I want to express this a bit more simply than other posters using powers of…

Start with one switch; this gives two options (on or off).

Add a second switch and now you have four options (on/on, off/off, on/off, off/on).

If you keep adding switches, each time you double the number of options you have so the sequence goes:
2, 4, 8, 16, 32, 64, 128, 256, 512, 1024

Which is why we have 1024 megabytes, not 1000: it’s difficult to get exactly 1000 options just using switches, but it’s easy to get 1024 options.
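Here’s that doubling written out as a tiny Python loop (just illustrative):

```python
options = 1
for switches in range(1, 11):
    options *= 2              # each added switch doubles the count
    print(switches, options)  # ends with: 10 1024
```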