Why didn’t the bit war continue into the seventh generation? Why didn’t the number of bits double to 256, like it did in past generations? Any insight into this would be appreciated.
People have already commented on how more than 128 bits is probably useless for addressing: even a 64-bit address space covers 2^64 bytes (about 16 exabytes), far more memory than any console will ship with. While addressing was the main driving force for increasing register width in the past, wider registers (256 or 512 bits) are still useful for vector (SIMD) instructions, which process many data elements with a single instruction.
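To make the SIMD point concrete, here is a minimal sketch in C using x86 AVX intrinsics (an assumption on my part: it needs an AVX-capable CPU and compilation with something like `gcc -mavx`). A single 256-bit instruction adds eight 32-bit floats at once, even though no single value is wider than 32 bits:

```c
#include <immintrin.h>  /* x86 AVX intrinsics */
#include <stdio.h>

int main(void) {
    float a[8]   = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8]   = {10, 20, 30, 40, 50, 60, 70, 80};
    float out[8];

    /* Load eight 32-bit floats into each 256-bit YMM register. */
    __m256 va = _mm256_loadu_ps(a);
    __m256 vb = _mm256_loadu_ps(b);

    /* One 256-bit instruction performs all eight additions in parallel. */
    __m256 vsum = _mm256_add_ps(va, vb);

    _mm256_storeu_ps(out, vsum);

    for (int i = 0; i < 8; i++)
        printf("%g ", out[i]);  /* prints: 11 22 33 44 55 66 77 88 */
    printf("\n");
    return 0;
}
```

So the wide hardware is still there, it just shows up as vector width rather than as a "256-bit console" marketing number, since pointers and general-purpose arithmetic gain nothing past 64 bits.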