ELI5- Why are Graphics cards getting so big & power hungry when other electronics are getting smaller and use less power?

In the past we put graphics cards in a PC; now we put a PC in a space heater

10 Answers

Anonymous 0 Comments

Gamers want performance. Games want performance. And then things really got crazy and graphics cards also started doing ray tracing, video decoding/encoding, AI stuff, and some productivity jobs.

Features and performance consume power. Lots of features and lots of performance consume lots of power. And the harder you push the hardware, the harder physics pushes back: holding higher clock speeds stable means raising the voltage, and power grows roughly with the square of voltage, so that’s just MORE power and heat!
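
Back-of-the-envelope sketch of that rule of thumb (power ≈ capacitance × voltage² × frequency); the capacitance value below is made up purely for illustration, not any real chip’s spec:

```python
# Rough model of CMOS dynamic power: P ≈ C * V^2 * f.
# The capacitance below is an illustrative guess, not a real spec.

def dynamic_power(c_farads, volts, hertz):
    """Approximate switching power of a chip."""
    return c_farads * volts ** 2 * hertz

C = 1e-9  # effective switched capacitance, farads (illustrative)

base = dynamic_power(C, volts=1.00, hertz=2.0e9)    # stock clocks
pushed = dynamic_power(C, volts=1.10, hertz=2.4e9)  # +10% voltage to hold a +20% clock

print(f"{pushed / base:.2f}x the power for 1.2x the speed")  # ~1.45x
```

A 20% overclock with a 10% voltage bump costs roughly 45% more power, which is exactly the “physics pushes back” effect.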

Anonymous 0 Comments

Computing power has advanced to the point that most of the things the average person wants to do only take a fraction of the available power. So you can have a small processor that isn’t pushed very hard, making it run cool and use less electricity.

Gaming graphics are an exception. They still push levels where even the best available hardware can struggle (at the highest resolutions and quality settings). So there is a market for the biggest, fastest processors. These draw lots of electricity and generate a lot of heat, necessitating big heat sinks and fans, which make the cards even bigger.

Anonymous 0 Comments

The graphics card itself did not truly get much bigger. What makes them big is the cooling. The water-cooled cards are actually still pretty small.
You have an incredibly high number of transistors to do the calculations, and they need space. Eventually the production method decreases the size of a transistor, like from 5nm to 3nm. Then you can have a lot more processing power in a smaller area.
But all that processing needs energy, as our computing systems run on electricity. So you need more and more energy to do all these calculations, but cards are now so efficient that they rarely use all the power they could.
Old cards used whatever they were offered. New cards will use only as much as necessary and won’t draw 500W for calculating 1+1=2.

Anonymous 0 Comments

They’re reaching the limits of how small individual transistors can be with current methods.

Once you can’t go smaller and denser, more compute power means bigger chips and/or more of them

Which ups the power needed

Which ups the cooling needed

All of which means more space needed

Anonymous 0 Comments

Screens are getting bigger (4K) and faster (144Hz). So you need to render more pixels, more often. So you need more computing power, which leads to bigger and more power-hungry graphics cards.
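
The raw pixel math makes the point. A quick sketch (the resolutions and refresh rates are just common examples, not anyone’s specific setup):

```python
# Raw pixels per second the card must fill, ignoring everything else
# (shading cost, ray tracing, etc. scale things up further).

def pixel_rate(width, height, hz):
    return width * height * hz

old = pixel_rate(1920, 1080, 60)    # 1080p at 60 Hz
new = pixel_rate(3840, 2160, 144)   # 4K at 144 Hz

print(f"{new / old:.1f}x more pixels per second")  # 9.6x
```

Going from 1080p60 to 4K144 is nearly ten times the pixels per second before the scene itself gets any more complex.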

Anonymous 0 Comments

Because unlike other forms of compute, 3D graphics that are trying to represent reality have an almost unlimited budget. There’s a point at which your Excel spreadsheet calculates fast enough that you stop caring about more speed. But you can always throw more objects on screen, rendered in more detail, with ray tracing and physics and all that. In theory that too will top out, but it’s a LONG way off. And the cost to build these more detailed worlds is pretty low, so the ability of developers to build that performance demand into games is pretty cheap and easy. In fact, in a lot of ways, it’s less work than trying to avoid it.

Computing over time has been constrained in many ways, and for most general-purpose stuff that people need to do, pretty cheap hardware is now more than adequate, allowing it to shrink in size and power needs. But ‘adequate graphics’ are a ways off yet. Though UE5 looks like a promising step in reducing a few more constraints there.

Anonymous 0 Comments

The root of the problem is the death of Moore’s Law. For the last few decades you could get big jumps in performance simply by shrinking the process node. Now we’ve reached a point of diminishing returns. However, gamers still expect that big jump in performance each generation, so the only way to achieve it is by making more power-hungry and expensive cards.

Anonymous 0 Comments

Graphics cards are staying about the same size; it’s the cooling hardware that is so big and bulky. Non-bulky versions of graphics cards are put in things like gaming laptops, and just like CPUs they are roughly the size of a large thumbnail. However, the laptop versions still need to shed heat just as fast, which is why gaming laptops usually have two large fans. Same cooling, just configured differently.

Graphics cards use a lot of energy to compute things so fast when they’re under load, because transistors take energy to turn on and off, and the more switching they do, the more they heat up. This requires a lot of cooling. CPUs and GPUs are designed to run best at a higher temperature than room temperature, but once they reach 90-100 degrees Celsius they can start to damage themselves if they are not cooled properly.
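
A toy model of where that heat comes from, using the textbook switching-energy estimate (about ½ × capacitance × voltage² per toggle); every constant below is an illustrative guess, not a real chip spec:

```python
# Toy model: heat from transistor switching, ~0.5 * C * V^2 joules per toggle.
# All constants are illustrative guesses, not real chip specs.

C_GATE = 1e-16        # capacitance switched per transistor, farads (guess)
VOLTS = 1.0           # supply voltage (guess)
FREQ = 2.0e9          # clock speed, Hz (guess)
N_TRANSISTORS = 50e9  # order of magnitude for a big modern GPU
ACTIVITY = 0.1        # fraction of transistors toggling each cycle (guess)

energy_per_toggle = 0.5 * C_GATE * VOLTS ** 2
toggles_per_second = N_TRANSISTORS * ACTIVITY * FREQ
power_watts = toggles_per_second * energy_per_toggle

print(f"~{power_watts:.0f} W of heat to get rid of")  # ~500 W with these guesses
```

Tiny energies per toggle, multiplied by tens of billions of transistors flipping billions of times a second, lands in flagship-GPU territory — and all of it comes out as heat the cooler has to remove.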

Anonymous 0 Comments

They have actually been getting more power efficient, and smaller as well. I assume you’ve seen the RTX 4090, but most of what you’re seeing on any modern GPU is the cooler, not the actual GPU die. Those have generally been trending downward in size. The reason power usage went up is that, at least as far as Nvidia is concerned, they resorted to a pretty big increase in core count to up the performance. If you did a one-to-one comparison to the last generation, it would be more efficient. The end result, though, is you need a beefy cooler to keep it running at stable temps.
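
A toy comparison of how that plays out; all the numbers here are invented for illustration, not real Nvidia specs:

```python
# Toy generational comparison: per-core efficiency improves,
# yet total power still climbs because the core count grew more.
# The numbers are invented for illustration, not real specs.

def card(cores, perf_per_core, watts_per_core):
    return {"perf": cores * perf_per_core, "watts": cores * watts_per_core}

last_gen = card(cores=10_000, perf_per_core=1.0, watts_per_core=0.035)
new_gen = card(cores=16_000, perf_per_core=1.3, watts_per_core=0.028)

for name, c in (("last gen", last_gen), ("new gen", new_gen)):
    print(f"{name}: {c['perf']:,.0f} perf, {c['watts']:.0f} W, "
          f"{c['perf'] / c['watts']:.1f} perf per watt")
# last gen: 10,000 perf, 350 W, 28.6 perf per watt
# new gen:  20,800 perf, 448 W, 46.4 perf per watt
```

Better perf-per-watt and higher total watts at the same time — which is why the wall-power number went up even though the silicon got more efficient.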

Anonymous 0 Comments

Graphics cards stay about the same size; it’s the computing power and capability that need way more cooling than older graphics cards did. The boards on these cards are usually the same size, unless the amount of VRAM increases, but the cooling units (fans, heat spreaders, etc.) always grow with the computing power.