– why are there 1024 megabytes in a gigabyte? Why didn’t they make it an even 1000?

22 Answers

Anonymous 0 Comments

The hardest thing to wrap your head around here is that there is nothing at all special or “even” about powers of 10 (10, 100, 1000, etc.) *except that* humans have 10 fingers.

Computers have 2 fingers.

Anonymous 0 Comments

It’s already been pointed out that computer memory is measured in powers of 2 because of the binary nature of the device.

However, marketing copy for storage devices generally uses decimal (1000-based) prefixes instead of the binary memory convention, which makes the advertised capacity look bigger than the figure your operating system reports.
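A quick Python sketch of that gap (the “500 GB” figure is just an example, not from any particular drive): the same bytes, counted in decimal units versus binary units.

```python
# Hypothetical example: a drive sold as "500 GB" uses decimal prefixes;
# the same bytes expressed in binary (power-of-two) units look smaller.
marketed_bytes = 500 * 1000**3        # 500 GB as the box defines it
as_gib = marketed_bytes / 1024**3     # the very same bytes in GiB
print(f"{marketed_bytes:,} bytes = {as_gib:.1f} GiB")  # ~465.7 GiB
```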

Anonymous 0 Comments

Answer: pretty much what everyone has said.

Actual answer: It **is** 1000.

What we call a megabyte, what **everyone** calls a megabyte, is actually technically a mebibyte. Mega is an SI prefix meaning one million (each step between SI prefixes is a factor of 1000), but computers work in base 2, so round decimal numbers like that aren’t ‘nice even numbers’ in binary.

They were called megabytes because 2^20 is ‘around a million’ bytes. When the binary prefixes were standardized, mebibyte (mega binary byte) was chosen to differentiate the binary unit from the standard 1000-based mega naming system.
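A small Python sketch of how close the two units are:

```python
megabyte = 10**6   # SI megabyte: 1,000,000 bytes
mebibyte = 2**20   # mebibyte:    1,048,576 bytes
print(mebibyte - megabyte)            # 48576 bytes of difference
print(round(mebibyte / megabyte, 3))  # 1.049 -> a mebibyte is ~4.9% bigger
```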

Anonymous 0 Comments

Because an international standard called “ISO/IEC 80000” was created around the 8-bit byte, so everything goes 8, 16, 32, 64, 128, 256, 512, 1024.

It’s a little hard to understand why, because you need to be a bit more than tech-savvy, but you have everything here: https://en.m.wikipedia.org/wiki/ISO/IEC_80000

Later edit: back in the 1960s, machines used memory word sizes of 12, 18, 24, 30, 36, 48, or 60 bits, and it made programming a nightmare: nothing was compatible across CPUs except the program written specifically for that word size. A 12-bit program would not work on an 18-bit machine, and the CPUs could not make sense of each other’s memory formats either, so an international standard was created.

Anonymous 0 Comments

Computers count in powers of two.

1024 is a round number in computer speak. 1000 is not.

More importantly, the way we make memory for computers and connect it to them means that memory tends to come in sizes that are powers of two.

This meant that early on you ended up with sizes like 65,536 bytes.

You could have simply used the k = 1000 meaning used everywhere else, but then you would have to round the true number off to 65.5k.

However, when you use 1 KB = 1024 bytes, 65,536 bytes is exactly 64 KB.

You could be exact and have a short way to write things down at the same time.
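The same arithmetic in Python, using the 65,536-byte example from above:

```python
size = 65536                 # 2**16 bytes, a typical early memory size
print(size / 1000)           # 65.536 -> the awkward "65.5k"
print(size / 1024)           # 64.0   -> exactly 64 "binary" KB
print(size == 64 * 1024)     # True
```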

Anonymous 0 Comments

Because it’s a power of 2.

But in practice, people just use 1000 anyway. Check bits and other overhead consume a small percentage of the bytes anyway, and at the scale of a gigabyte it isn’t worth figuring out exactly how much usable memory you have. Besides, we’re talking about a difference of a few percent, which doesn’t matter in conversation. The computer will do all its compression and whatnot and spit out the real value if you need the real value, but for conversation, 1000 individual megabyte folders are just a gigabyte, not “97% of a gigabyte”.

Anonymous 0 Comments

Because 1024 is a power of two.

Computers do everything in powers of two because that’s the only way to represent numbers as a series of switches. It goes back to computers being made out of transistors, which are basically switches controlled by current. And those switches can be used to control other switches, etc. A computer is basically just a big pile of switches organized to perform complex functions. You feed numbers into that pile of switches by turning switches on and off (yes this is a massive oversimplification). And yes, everything is numbers. Text is numbers. Images are numbers. It’s all numbers. And they’re all base 2 numbers. Even base 10 numbers are represented internally as base 2.

Anyway, the reason memory sizes are powers of two is that to refer to a location in memory you have to say where in the memory something is, and that “where” is a number. A number in base 2.
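A tiny Python sketch of the idea: each extra address bit doubles how many locations you can point at, which is why natural memory sizes land on powers of two (the bit counts below are just common examples).

```python
# Each extra address bit doubles the number of memory locations you can name.
for bits in (10, 16, 20, 32):
    print(f"{bits} address bits -> {2**bits:,} locations")
# 10 -> 1,024 (1 KiB), 16 -> 65,536 (64 KiB), 20 -> 1 MiB, 32 -> 4 GiB
```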

Anonymous 0 Comments

>even 1000

Because, believe it or not, 1024 is more “even” than 1000. You can divide 1000 by 2 only three times before you reach an odd number (1000/2 = 500, 500/2 = 250, 250/2 = 125, which is odd).

On the other hand, you can divide 1024 by 2 ten times, and doing so takes you all the way down to the smallest positive whole number: 1024/2 = 512, 512/2 = 256, 256/2 = 128, 128/2 = 64, 64/2 = 32, 32/2 = 16, 16/2 = 8, 8/2 = 4, 4/2 = 2, 2/2 = 1.
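A small Python sketch of the halving, just counting how many times each number divides cleanly by 2:

```python
n = 1024
halvings = 0
while n % 2 == 0:
    n //= 2
    halvings += 1
print(n, halvings)   # 1 10  -> 1024 halves cleanly ten times, down to 1

m = 1000
halvings = 0
while m % 2 == 0:
    m //= 2
    halvings += 1
print(m, halvings)   # 125 3 -> 1000 hits an odd number after three halvings
```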

Anonymous 0 Comments

Technically, a gigabyte is equal to 1000 megabytes.

The correct term for the 1024-based unit is the gibibyte, which holds 1024 mebibytes.

But the non-IT world does not use the “correct” form.

Anonymous 0 Comments

There’s actually something called a Gibibyte (GiB).

A gibibyte and a gigabyte are sometimes used as synonyms, though technically they do not describe the same amount of capacity. They are close in size, however. A gibibyte is equal to 2^30 or 1,073,741,824 bytes. A gigabyte is equal to 10^9 or 1,000,000,000 bytes. One gibibyte equals about 1.074 gigabytes. That’s roughly a 7% difference.
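A quick Python check of those numbers:

```python
gibibyte = 2**30    # 1,073,741,824 bytes
gigabyte = 10**9    # 1,000,000,000 bytes
print(gibibyte - gigabyte)            # 73,741,824 bytes apart
print(round(gibibyte / gigabyte, 3))  # 1.074 -> roughly a 7% gap
```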