Why can’t the heat put out from servers (and similar electronics) be used to generate electricity/power?


If we can use heat sources as energy, why not use it to put power back into the system?

In: Technology

We cannot generate power from heat without a serious temperature difference between different parts of the generator. A generator operating on the excess heat of a server would be very inefficient and complex. It's not worth building a bulky generator, with a cost comparable to that of the servers themselves, to recover perhaps 0.5% of the electricity.

Because to produce useful energy, you need something to be a lot hotter than the environment for it to be efficient.

The issue is the "Carnot Factor". The maximum fraction of heat you can convert to work depends on the temperature difference with the environment. The factor is (T_hot − T_cold)/T_hot, with temperatures in kelvin.

That means lukewarm server cooling water is rather inefficient, and the hardware you'd need to extract the energy wouldn't pay for itself in a hundred years.

For comparison, a steam power plant usually operates at about 550 K above the outside temperature and reaches about 40% efficiency with that. At 70 K above ambient (which would be horribly hot for cooling a server) you could reach at most 20% even if you had no losses at all.
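The Carnot factor above is easy to check numerically. This is a minimal sketch (the 20 °C ambient temperature is an assumption for illustration) comparing the ideal limits for a steam plant, a very hot server coolant loop, and a more realistic 60 °C server water loop:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat convertible to work: (T_hot - T_cold) / T_hot."""
    return (t_hot_k - t_cold_k) / t_hot_k

ambient = 293.0  # assumed ~20 degrees C outside/room temperature, in kelvin

# Steam power plant, ~550 K above ambient: ideal limit ~65% (real plants: ~40%)
print(f"steam plant:   {carnot_efficiency(ambient + 550, ambient):.0%}")

# Server coolant 70 K above ambient (horribly hot for a server): ideal limit ~19%
print(f"hot server:    {carnot_efficiency(ambient + 70, ambient):.0%}")

# Lukewarm 60 degrees C server cooling water, 40 K above ambient: ideal limit ~12%
print(f"server water:  {carnot_efficiency(ambient + 40, ambient):.0%}")
```

Real machines fall well short of these ideal numbers, so the usable fraction from server-grade heat ends up in the low single digits at best.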

Money. The only practical way to get power back would be thermoelectric elements, but they are expensive, take up space, and the power they generate is low.

We’ve looked at it. But electricity generation isn’t a great use for it.

Modern electronics are designed to run no hotter than about 100 degrees Celsius, and even then they're already thermal throttling (reducing power consumption to keep their temperature under control). Realistically they run in the 60 to 80 degree range at full load. So that's the sort of temperature we're working with here.

Water boils at 100. So that's not hot enough to boil water, meaning a steam engine design is no good. Other heat-based engines, like a Stirling engine, honestly aren't very good in terms of power output. If room temperature is around 20 degrees, that temperature difference is just too low for most engines to be useful. Furthermore, the output would vary with how hard the electronics are working, and an inconsistent power source is a problem of its own.
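To put numbers on "too low": this is a rough sketch (assuming a 20 °C room) of the ideal heat-engine limit for the 60–80 °C range that electronics realistically run at, before any real-engine losses are subtracted:

```python
# Ideal (Carnot) limit for any heat engine: (T_hot - T_cold) / T_hot, in kelvin.
T_ROOM = 293.15  # assumed 20 degrees C room temperature

for t_hot_c in (60, 80):
    t_hot = t_hot_c + 273.15
    limit = (t_hot - T_ROOM) / t_hot
    print(f"{t_hot_c} degree C source: ideal limit {limit:.0%}")
```

That works out to roughly 12–17% even for a perfect engine; a real Stirling or similar engine at that temperature difference captures far less, which is why the heat is better used directly for heating.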

There are other uses for the heat, though. 60-degree water is good for other applications, like heating a building during the winter or preheating general-purpose hot water. There are datacenters whose heat output is sent to nearby buildings that can make use of it in that way. So there's no need to burn natural gas for hot water or other indoor heating, or at least a reduced need.

2nd Law of Thermodynamics.

You can never create a device with 100% efficiency.

The efficiency of an ideal heat engine depends on the temperature difference; the heat from servers is mild compared to that in a thermal plant (and even those are quite inefficient).