How come CPUs and GPUs can draw way more power than their specified TDP?



TDP is basically just a safety guideline; it doesn’t mean the chip cannot physically go over it. It isn’t even a standardized guideline between different manufacturers.

It’s not really a physical maximum.

If you want to reduce the energy usage and the temperature, have a look at the GPU driver settings. I capped my FPS at about 5 below my average max FPS, and as a result my GPU ran about 20% cooler (temperature is directly tied to energy usage).

Depending on the GPU and driver, there are several possible options:

* Set a max FPS
* Set a max temperature
* Set a power target
* Underclock
* …

It makes little sense to do this through a game’s graphics options; those only say “deliver this image quality with all the performance you can”, while the GPU settings in the driver say “deliver this much work and then take a coffee break”.


For the CPU it’s more complicated.

TDP on processors is a made-up number that is whatever the manufacturers set it to be (Gamers Nexus: AMD Ryzen TDP Deep-Dive & What Cooler Manufacturers Think of “TDP”).

TDP = ( Temperature (CPU) – temperature (ambient) ) / heat sink thermal resistance

The equation above is a real physics equation for thermal conductance. The problem is that CPU manufacturers use it in reverse: they define the temperatures and thermal resistances to be whatever they want, and out comes whatever number they want for the “TDP”.
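To see how the same equation can be bent to produce different numbers, here is a quick sketch (the temperatures and thermal resistance below are placeholder values for illustration, not figures from any vendor’s datasheet):

```python
def tdp_watts(t_die_c, t_ambient_c, theta_c_per_w):
    """Heat (in watts) that a cooler with thermal resistance theta
    can move at a given temperature difference."""
    return (t_die_c - t_ambient_c) / theta_c_per_w

# Same cooler (same theta), but different assumed die temperatures
# produce very different "TDP" numbers.
print(round(tdp_watts(61.8, 42.0, 0.189), 1))  # -> 104.8
print(round(tdp_watts(80.0, 42.0, 0.189), 1))  # -> 201.1
```

Pick a hotter allowed die temperature and the “TDP” for the identical cooler nearly doubles, which is exactly the degree of freedom the answer above is describing.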

For example, the TDP for Ryzen 5000 CPUs used different numbers than for Ryzen 7000 CPUs, so a “65 W” 5600X consumes about 67 W while a “65 W” 7600 consumes 87 W (Gamers Nexus benchmark).

TDP doesn’t directly represent the chip’s power draw; instead it represents how much heat (heat is measurable in watts) the chip can safely output, which then has to be dissipated.

A CPU’s TDP rating is basically a guideline saying “your cooling solution must be able to dissipate this much heat”.

In the past, TDPs used to be a much more realistic measure of how much power a chip would actually consume than they are today.

Nowadays, TDPs typically refer to the amount of power the chip will consume at base clock speeds rather than the maximum actual expected power consumption (technically, it measures thermal dissipation and not power, but given that power is converted to heat, it represents the same thing). At boost clocks, chips will typically use more power for as long as temperatures will allow until the chip is forced to clock back down for thermal purposes.
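A toy simulation of that boost-then-throttle pattern (all constants are invented for illustration and do not model any real chip’s firmware):

```python
# Toy model: the chip draws well above its rated TDP while the die
# is below the temperature limit, then falls back to rated power.
TDP_W = 65.0             # rated power (assumed)
BOOST_W = 90.0           # short-term boost power (assumed)
TEMP_LIMIT_C = 95.0      # throttle point (assumed)
AMBIENT_C = 25.0
THETA_C_PER_W = 1.0      # cooler thermal resistance (assumed)
HEAT_CAP_J_PER_C = 20.0  # thermal mass of die + heatsink (assumed)

def simulate(seconds):
    temp = AMBIENT_C
    powers = []
    for _ in range(seconds):
        power = BOOST_W if temp < TEMP_LIMIT_C else TDP_W
        # Heat added minus heat the cooler removes, over one second.
        temp += (power - (temp - AMBIENT_C) / THETA_C_PER_W) / HEAT_CAP_J_PER_C
        powers.append(power)
    return powers

trace = simulate(120)
# The chip boosts at first (trace starts at BOOST_W), then hovers at
# the temperature limit, spending most of its time at rated power.
```

With a cold die, the model happily runs well above “TDP”; the rated number only matters once the cooler can no longer keep the die under its limit.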

It does make devices look better on paper, but it doesn’t really represent the actual maximum power consumption. It’s more of a “minimum guaranteed performance at a specified power level.” Most devices will allow their chips to use more power than their TDP might suggest as long as their cooling systems have been well-designed to dissipate the extra heat.