Why did the console bit wars end? During the 32-bit era, the PS1 and Saturn were 32-bit systems, and Nintendo was boasting about having a 64-bit system. The last time console makers boasted about the bits in their systems was the sixth generation, with the Dreamcast, GameCube, and PS2 being marketed as 128-bit.


Why didn’t the bit war continue into the seventh generation? Why didn’t the number of bits double to 256 like it did in past generations? Any insight into this would be appreciated.


What the “bits” refer to has changed over time: sometimes the memory bus width, sometimes the vector instruction width, but usually the CPU architecture’s native word size.

We’ve had memory buses 512 bits wide and beyond for a while, with memory bandwidth scaling linearly with clock speed and bus width. There have even been 2048-bit memory buses on some GPUs with stacked memory.
We have 512-bit vector instructions, which typically pack 32- or 64-bit operations together, and widths can go higher still. We occasionally need very wide arithmetic for things like cryptography, but we usually have dedicated hardware for that.
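To make the bandwidth point concrete: bandwidth is roughly bus width in bytes times transfer rate, so a 256-bit bus at 8 GT/s moves 32 × 8 = 256 GB/s. And here’s a minimal C sketch of what a 512-bit vector instruction buys you, assuming an x86 compiler with AVX-512F support (built with something like -mavx512f); one instruction performs sixteen 32-bit additions:

    // One 512-bit register holds sixteen 32-bit integers; a single
    // instruction adds all sixteen lanes at once.
    #include <immintrin.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        int32_t a[16], b[16], out[16];
        for (int i = 0; i < 16; i++) { a[i] = i; b[i] = 100 * i; }

        __m512i va = _mm512_loadu_si512(a);    // load 16 x int32
        __m512i vb = _mm512_loadu_si512(b);
        __m512i vc = _mm512_add_epi32(va, vb); // 16 adds, one instruction
        _mm512_storeu_si512(out, vc);

        for (int i = 0; i < 16; i++) printf("%d ", (int)out[i]);
        printf("\n");
        return 0;
    }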

The CPU bit width sets both the default register/variable size and the memory address width, but 32-bit integers and floating-point numbers are more than enough for games. 64-bit mainly mattered for memory addressing: 32-bit addresses limit you to 4 GB of RAM. The N64 did have a 64-bit CPU, but it didn’t need that capability and rarely used it.
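A quick C sketch of why the 4 GB limit falls straight out of 32-bit addressing (the arithmetic is the whole point; the printout is just illustration):

    // A 32-bit pointer has 2^32 distinct values, one per byte address,
    // so it can name at most 4 GiB of memory.
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint64_t space32 = 1ULL << 32;  // 2^32 byte addresses
        printf("32-bit address space: %llu bytes = %llu GiB\n",
               (unsigned long long)space32,
               (unsigned long long)(space32 >> 30));
        // A 64-bit pointer could in principle address 2^64 bytes (16 EiB);
        // real CPUs wire up fewer physical address bits, but far beyond 4 GiB.
        printf("pointer size on this machine: %zu bytes\n", sizeof(void *));
        return 0;
    }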

GPUs also have their own bit widths, since they too use SIMD or VLIW instructions, packing many 32-bit operations together. For many graphics operations, even 16-bit or 8-bit precision is useful.
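As a rough illustration of packing narrow lanes, here’s a “SIMD within a register” sketch in plain C: one 32-bit add standing in for two independent 16-bit adds, with masking so a carry can’t leak between lanes. GPUs do the real version of this in hardware; the function name here is made up for the example:

    #include <stdint.h>
    #include <stdio.h>

    // Add two pairs of 16-bit lanes packed into 32-bit words.
    // Each lane wraps modulo 2^16 independently of the other.
    static uint32_t add_u16x2(uint32_t a, uint32_t b) {
        uint32_t lo = (a + b) & 0x0000FFFFu;                  // low lane
        uint32_t hi = ((a & 0xFFFF0000u) + (b & 0xFFFF0000u)) // high lane
                      & 0xFFFF0000u;
        return hi | lo;
    }

    int main(void) {
        uint32_t a = (7u << 16) | 1000u;   // lanes: hi=7,  lo=1000
        uint32_t b = (3u << 16) | 65535u;  // lanes: hi=3,  lo=65535
        uint32_t c = add_u16x2(a, b);
        printf("hi lane: %u, lo lane: %u\n", c >> 16, c & 0xFFFFu);
        // prints "hi lane: 10, lo lane: 999" -- the low-lane carry
        // wrapped within its own lane instead of corrupting the high one
        return 0;
    }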

Since 32 bits is generally plenty for most computations, we scale other aspects of hardware instead of bit width: clock speed, core counts, and instructions per clock cycle. GPUs are generally rated in FLOPS, while CPUs are more concerned with MIPS.
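For instance, a peak-FLOPS estimate multiplies exactly those scaled factors together, and bit width never appears. The numbers below are illustrative, not any particular chip:

    #include <stdio.h>

    int main(void) {
        // Illustrative figures, not a specific chip:
        double cores        = 8;
        double clock_ghz    = 3.0;
        double lanes        = 8;   // e.g. 256-bit vectors of 32-bit floats
        double flops_per_op = 2;   // a fused multiply-add counts as 2 FLOPs

        double gflops = cores * clock_ghz * lanes * flops_per_op;
        printf("peak: %.0f GFLOPS\n", gflops);  // 8 * 3 * 8 * 2 = 384
        return 0;
    }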
