Why do CPUs and GPUs need exponentially more power for marginal gains?

For example, the 7950X at 230 W is only 6.5% faster than at 105 W. Why is so much of that extra power being wasted?
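To put those numbers in perspective (taking the figures in the question at face value): 230 W / 105 W ≈ 2.19× the power for 1.065× the performance, so performance per watt falls to roughly 1.065 / 2.19 ≈ 0.49, i.e., less than half of what it was.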

4 Answers

Anonymous

Imagine running a kitchen at a restaurant with, say, 5 chefs. Business is going well and you’re feeding, say, 25 customers an hour. Now you want to start scaling up and serving more customers. Simple enough: you hire an additional chef, and that solves the problem; you can now serve 30 customers in the same time.

Now business is booming, and maybe a year down the line you think you can do the same thing: “If I hire a new chef, I can feed 5 more customers an hour.” So you try it and find that you’re only feeding maybe 1 or 2 more customers an hour. The problem is, there’s not enough counter space for all the chefs, and not enough pots, pans, and cooktops. Those need to scale as well. You also need more servers to keep up with the orders, a system to keep track of which chefs should cook which meals, and so on.

The point is, to get the same gains you saw in the past, you need to do more than hire an additional chef.

Likewise for CPUs and GPUs. If it were as simple as more cores equals more performance, power requirements would scale linearly with performance, but that’s not the whole picture. You need additional hardware and firmware to make sure all the cores can do their job, and that gets much harder when you try to fit them all into the same space, and harder still when you want to make sure they’re not making mistakes (see the sketch below).
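One rough way to put a number on that coordination overhead is Amdahl’s law: if some fraction of the work is inherently serial (the “system to keep track of which chefs cook which meals”), adding cores gives shrinking returns. A minimal Python sketch, where the 10% serial fraction is purely an illustrative assumption, not a measured value for any real chip:

# Amdahl's law: speedup from N cores when a serial fraction s of the work
# cannot be parallelized. s = 0.10 here is an illustrative assumption.
def speedup(cores, serial_fraction=0.10):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (1, 2, 4, 8, 16, 32):
    print(f"{cores:2d} cores -> {speedup(cores):.2f}x speedup")

# Prints roughly: 1.00x, 1.82x, 3.08x, 4.71x, 6.40x, 7.80x.

Each doubling of cores buys less than the one before, and that’s before counting the extra power spent on the interconnect and cache coherency logic that keeps the cores from “making mistakes.”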
