Why do CPUs and GPUs need exponentially more power for marginal gains?


For example, the 7950X at 230W is only 6.5% faster than at 105W. Why is all that extra power being wasted?


4 Answers

Anonymous 0 Comments

Well, I’m not sure what data you’re looking at, but usually in these sorts of specs it isn’t “the CPU draws 230W”, it’s “the CPU can draw *up to* 230W”. So you might not be able to assume that going from 105W to 230W actually represents a massive increase in the power being drawn in practice.

There is also sometimes an issue with the “limits” being more like suggestions. Just because you set the system to “105W” doesn’t mean the CPU is actually drawing 105 watts or less.

There is then a third issue: there are a few different ways to measure “watts”. The 105W may be a TDP (thermal design power) figure and the 230W a power-delivered-to-the-socket figure, and the latter is generally a significantly larger number.
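Beyond the measurement issues above, there is also a physical reason small speed gains cost a lot of power. A common simplified model (not from the answer above, and the numbers below are purely illustrative) is that dynamic power scales roughly as P ∝ C·V²·f, and that voltage has to rise along with clock frequency, so power grows much faster than clock speed:

```python
# Illustrative sketch of the classic dynamic-power model P ~ C * V^2 * f.
# Assumption (hypothetical, for illustration only): voltage rises roughly
# linearly with frequency once you're past the chip's efficient range.
def relative_power(freq_ratio, v_scaling=1.0):
    """Power draw relative to baseline when clocking at freq_ratio x baseline.

    v_scaling is a made-up knob for how strongly voltage tracks frequency;
    real voltage/frequency curves are nonlinear and part-specific.
    """
    voltage_ratio = 1.0 + v_scaling * (freq_ratio - 1.0)
    return freq_ratio * voltage_ratio ** 2

# A hypothetical 6.5% clock bump costs ~21% more power under this model:
print(f"{relative_power(1.065):.2f}x power for 1.065x clock")
```

Under this toy model, chasing the last few percent of frequency is exactly where the power cost balloons, which is consistent with a 105W-limited chip getting most of the performance of a 230W-limited one.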
