Why did the console bit wars end? During the 32 bit era, PS1 and Saturn were 32 bit systems, and Nintendo was boasting about having a 64 bit system. The last time console makers boasted about bits in their system was the sixth generation, with the Dreamcast, GameCube, and PS2 being 128 bits.


Why didn’t the bit war continue into the seventh generation? Why didn’t the amount of bits double to 256 bits like they did in past generations? Any insight into this would be appreciated.



Anonymous 0 Comments

*TL;DR* We got to a point where computers could show more shades of color than most people’s eyeballs could tell the difference between. We just kinda stopped using it as an advertising bullet point.

The advertised bits are the “[bit-depth](https://en.wikipedia.org/wiki/Color_depth)” of the color space: basically, how many individual colors the computer can display. When we had 8-bit color, you could give every pixel on the TV screen a binary number that was its color, anything from 00000000 to 11111111. That makes 256 numbers in total. [Here is a picture of all of them.](https://upload.wikimedia.org/wikipedia/commons/9/93/256colour.png)

Same with a 16-bit image. More numbers allows for many more colors with finer differences between them. This is not just a number twice as big, it’s twice as many digits long. This is 65536 colors. [Here is a picture of all of them](https://upload.wikimedia.org/wikipedia/commons/d/d1/RGB_16bits_palette.png).

At 24-bit color we start hitting the edge of what computers could do for most of their history. We’d like 32-bit color, but we need a few of those last bits to carry other information. Not to worry: even with 24-bit, that’s 16,777,216 individual colors. [Here is a picture of all of them.](https://upload.wikimedia.org/wikipedia/commons/e/e9/16777216colors.png)

Around this time, the display settings on home computers just said “millions of colors” for the last option on the menu. After that, we did start using 30 bits for color, allowing for 1,073,741,824 colors (no picture this time). We are at a point where most monitors can’t display such fine differences in color, and 99.9999% of people couldn’t tell, or wouldn’t care that much, even if theirs could.
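If it helps to see the arithmetic behind those counts, here is a minimal sketch (Python, purely illustrative) of how the number of representable colors grows with bit depth:

```python
# Number of distinct colors for a given color bit depth: 2 ** bits
for bits in (8, 16, 24, 30):
    print(f"{bits:2d}-bit color -> {2 ** bits:,} colors")

# Output:
#  8-bit color -> 256 colors
# 16-bit color -> 65,536 colors
# 24-bit color -> 16,777,216 colors
# 30-bit color -> 1,073,741,824 colors
```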

Anonymous 0 Comments

There’s a lot of answers addressing the technical side of it, but I want to add something else since I lived through the whole console war and saw all the ads and hype.

After the SNES and Genesis, they started with the whole 32/64 bit marketing, with the Nintendo 64 being the most vocal about it, right in its name. But beyond the technical reality that there was no need to go past 64 bit processing, there’s another major reason the marketing of it was dropped.

In 1993, Atari released the Jaguar, a supposedly “64 bit” gaming console. You can look up its history for more detail, but to put it simply, it was a massive pile of crap. It also involved some false advertising for a 64 bit console, since it only had a 32 bit CPU and used different components and technical tricks to kinda process up to 64 bits.

Anyway, it left a sour taste in gamers’ mouths, and when you have a handful of companies wanting a piece of the next-gen pie and a rabid community to feed it to, you steer away from any bad publicity.

When the Nintendo 64 was first being advertised, it was the Nintendo Ultra 64. They played up the 64 bit in a major way, but after things like the Jaguar and other companies focusing so much on the bits, they shied away from it. Other companies followed suit because they either wanted to avoid similar negative marketing or didn’t want to admit their competitors (namely Nintendo at the time) had a stronger system.

Of course, the focus on 3D graphics changed what people viewed as important or exciting in games by that generation, too.

Anonymous 0 Comments

8 bits is really, really limiting: having a max of 255 of anything visibly impacted games. Even a max of 32,767 of something is a limit you hit, especially in memory.

After that, meh, it stops being the biggest barrier.
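To make those ceilings concrete, here is a quick sketch (Python, just illustrative) of the largest counts that fit in common integer widths; the 255 and 32,767 above are the unsigned 8-bit and signed 16-bit maximums:

```python
# Largest value an unsigned and a signed integer of a given width can hold
for bits in (8, 16, 32, 64):
    unsigned_max = 2 ** bits - 1          # all bits set
    signed_max = 2 ** (bits - 1) - 1      # one bit reserved for the sign
    print(f"{bits:2d}-bit: unsigned max {unsigned_max:,}, signed max {signed_max:,}")

#  8-bit: unsigned max 255, signed max 127
# 16-bit: unsigned max 65,535, signed max 32,767
# ...
```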

Anonymous 0 Comments

It’s just marketing terms stretching the definition of bitness. At some point, if you bullshit too much, it’s not true anymore. People were claiming bitness based on the width of the memory bus.

Anonymous 0 Comments

Because 128 bits is way, way, way more than anyone can make productive use of. That would provide vastly more address space than any conceivable amount of memory. 64 bits is plenty.

The mistake people make is thinking that 128 bits is twice as big as 64 bits. That is wrong. If you have 2 bits, you can address 4 things. With 3 bits, 8 things. With 4 bits, 16. So 65 bits is twice as big as 64 bits, not 128. 128 bits is about 18.4 quintillion times bigger than 64 bits.
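A tiny sketch (Python, for illustration only) of that doubling: each extra bit doubles how many things you can address, so going from 64 to 128 bits multiplies the count by 2^64, not by 2.

```python
# Each added bit doubles how many distinct things you can address.
print(2 ** 2)    # 4
print(2 ** 3)    # 8
print(2 ** 4)    # 16

# 128 bits vs 64 bits: the ratio is 2**64, not 2.
print(2 ** 128 // 2 ** 64)   # 18446744073709551616  (~18.4 quintillion)
```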

Anonymous 0 Comments

Marketing bit off more than it could chew. The higher the number, the more they had to justify.

Anonymous 0 Comments

TL;DR: Sony blew everyone else away

Sega bailed after the Saturn/Dreamcast, Nintendo tried to be different and hold its niche market, and MS threw money at the problem stubbornly with no soul, buoyed only by Halo.

It stopped being about bits a long time ago, or even the hardware itself. It became about platform exclusives, consumables, a longer revenue tail, and online play.

Which really means the ecosystem with the most users wins. That’s why MS bought Activision: purely to buy traffic, users you can sell other things to, and whose entertainment, search, and shopping data you can integrate into a unified profile.

Sony won the user race 8 years ago. It’s not really been close since.

Anonymous 0 Comments

Sony won.

Sega lost.

Nintendo decided to quietly give up and just focus on making quality games again.

When Microsoft decided to challenge Sony, they did so via exclusive titles rather than hardware, since the machines weren’t that vastly different.

Anonymous 0 Comments

To keep it very short, the console bit war ended for two main reasons:

1) There is no point in going past 64 bits, as the extra power needed to process wider instructions would outweigh the advantage it carries. Even “128 bit” systems were really 32-bit devices using parallel computation (see the sketch after this list).

These bits indicate the width of the data the CPU can fetch and operate on in a single instruction. Marketing gimmicks instead focused on showing how big the system bus was (the data highway connecting the main parts of the system).

2) Standardization of game development also pushed the bit wars, and the hardware-focused console war in general, toward an end. The push to use similar SDKs for games across different devices, to speed up development and publishing, led to a “flattening” in the variance of consoles’ internal electronics.
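Here is the sketch referenced in point 1: a toy illustration (Python, purely hypothetical, not how any real console chip works) of how arithmetic wider than the native word size can be stitched together from 32-bit pieces plus a carry:

```python
# Illustrative only: adding two 128-bit values using four 32-bit chunks,
# the way narrower hardware can emulate wider arithmetic.
MASK32 = 0xFFFFFFFF

def add_128bit_via_32bit_chunks(a, b):
    """Add two 128-bit integers using only 32-bit-wide additions plus a carry."""
    result, carry = 0, 0
    for i in range(4):                          # four 32-bit limbs, low to high
        a_limb = (a >> (32 * i)) & MASK32
        b_limb = (b >> (32 * i)) & MASK32
        total = a_limb + b_limb + carry         # at most 33 bits wide
        result |= (total & MASK32) << (32 * i)  # keep the low 32 bits
        carry = total >> 32                     # carry into the next limb
    return result & ((1 << 128) - 1)            # wrap around like fixed-width hardware

a, b = 2**100 + 12345, 2**99 + 67890
assert add_128bit_via_32bit_chunks(a, b) == (a + b) % 2**128
```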

Anonymous 0 Comments

Basically, CPU bit width is a measure of how big the numbers it can handle at a time can be. Not just for math, but also for RAM addresses, which the CPU uses to find data. Bits were important for early computers, where RAM sizes grew very rapidly. However, as you might expect, there are diminishing returns. Treating bit width as a measure of how much RAM the CPU can address, 64-bit (what the majority of computers run on) can handle a very, very large amount of RAM, into the petabyte range and beyond.
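For a rough sense of the scale involved, here is a small sketch (Python, assuming a flat, byte-addressable memory) of the maximum RAM each address width can reach:

```python
# Maximum memory a flat, byte-addressable address space of a given width can reach
for bits in (16, 32, 64):
    print(f"{bits}-bit addresses -> {2 ** bits:,} bytes")

# 16-bit addresses -> 65,536 bytes                      (64 KiB)
# 32-bit addresses -> 4,294,967,296 bytes               (4 GiB)
# 64-bit addresses -> 18,446,744,073,709,551,616 bytes  (16 EiB)
```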

TL;DR: it just became irrelevant as tech advanced.