What are 32 and 64 bits in computers, and what difference do they make?


Does it make the computer faster? And how are they different from 8- and 16-bit video game consoles?



Anonymous

This same question was asked 7 hours after yours, and I liked user Muffinshire’s explanation there the most:

“Computers are like children – they have to count on their fingers. With two “fingers” (bits), a computer can count from 0 to 3, because that’s how many possible combinations of “fingers” up and down there are (both down, first up only, second up only, both up). Add another “finger” and you double the possible combinations to 8, so it can count from 0 to 7. Early computers were mostly used for text, so they only needed eight “fingers” (bits) to count to 255 – more than enough for all the letters, numbers, symbols and punctuation we normally encounter in European languages. Early computers could also use their limited numbers to draw simple graphics – not many colours, not many dots on the screen, but enough.

So if you’re using a computer with eight fingers and it needs to count higher than 255, what does it do? Well, it has to break the calculations up into lots of smaller ones, which takes longer because it needs a lot more steps. How do we get around that? We build a computer with more fingers, of course! The jump from 8 “fingers” to 16 “fingers” (bits) means the computer can count to 65,535, so it can do big calculations more quickly (or several small calculations simultaneously).

Now, as well as doing calculations, computers need to remember the things they calculated so they can come back to them again. They do this with their memory, and they need to count the units of memory (bytes) too, so they can remember where they stored all the information. Early computers had to do tricks to count bytes higher than the numbers they knew – an 8-bit computer wouldn’t be much use if it could only remember 256 numbers and commands. We won’t get into those tricks now.

By the time we were building computers with 32 “fingers”, the numbers they could count were so high they could keep track of about 4.3 billion pieces of information in memory – 4 gigabytes. This was plenty for a while, but we kept demanding that computers keep track of more and more information. The jump to 64 “fingers” gave us so many numbers – 18 quintillion, or for memory space, 16 billion gigabytes! That’s more than enough for most needs today, so the need to keep adding more “fingers” no longer exists.”
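
To make the finger-counting concrete, here’s a quick Python sketch of the same arithmetic – nothing machine-specific, just 2**n, the number of patterns n bits can make:

    # Each extra bit doubles the number of patterns, so n bits give
    # 2**n combinations and can count from 0 up to 2**n - 1.
    for bits in (2, 3, 8, 16, 32, 64):
        combos = 2 ** bits
        print(f"{bits:>2} bits: {combos:,} patterns, counts 0 to {combos - 1:,}")

    # The same arithmetic limits memory: an n-bit address can label 2**n bytes.
    print(f"32-bit addresses reach {2**32:,} bytes (exactly 4 GiB)")
    print(f"64-bit addresses reach {2**64:,} bytes (exactly 16 EiB)")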

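And here’s a sketch of the “break the calculations up into lots of smaller ones” trick: adding two 16-bit numbers on a machine that can only handle 8 bits at a time takes two additions plus carry bookkeeping instead of one. (The function and the byte-splitting are my own illustration; a real 8-bit CPU does this with an add-with-carry instruction.)

    def add16_on_8bit(a: int, b: int) -> int:
        """Add two 16-bit numbers using only 8-bit pieces, the way an
        8-bit CPU must: low bytes first, then high bytes plus the carry."""
        lo = (a & 0xFF) + (b & 0xFF)        # step 1: add the low bytes
        carry = lo >> 8                     # did the low half overflow 255?
        hi = (a >> 8) + (b >> 8) + carry    # step 2: add the high bytes + carry
        return ((hi & 0xFF) << 8) | (lo & 0xFF)

    assert add16_on_8bit(300, 500) == 800   # two steps where a 16-bit CPU needs one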