Why do datacenters continuously use more water instead of recycling the same water in a closed loop system?

Because the water is used to remove heat. In order to carry the heat away, it must be either discharged or evaporated. If it’s just looped, it soon gets unusably hot.
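
To put rough numbers on why a sealed loop heats up fast, here's a quick back-of-the-envelope sketch. The 1 MW IT load and 50,000 L loop volume are made-up, illustrative figures, not anything from the answer above:

```python
# Rough sketch: how fast does a fixed volume of water heat up if the
# heat is never rejected anywhere? All numbers are assumptions.

POWER_W = 1_000_000        # assumed IT heat load, 1 MW
WATER_MASS_KG = 50_000     # assumed loop volume, 50,000 L of water ~= 50,000 kg
SPECIFIC_HEAT = 4186       # J/(kg*K) for liquid water

heating_rate_k_per_s = POWER_W / (WATER_MASS_KG * SPECIFIC_HEAT)
heating_rate_k_per_h = heating_rate_k_per_s * 3600

print(f"Loop heats up at about {heating_rate_k_per_h:.1f} K per hour")
# ~17 K per hour: starting at 25 C, the loop passes 60 C in roughly two hours
# unless the heat is dumped somewhere (evaporation, discharge, dry coolers).
```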

I don’t know about data centers specifically, but some cooling systems do have a closed loop of water; they then cool one side of that loop with water that’s evaporated or discharged, or by some other method.

It depends on what the water is being used for.

If it’s being used for cooling, it is recycled in a closed loop. The water comes in and pulls heat out of the cooling units, then gets pumped to outside radiators with fans blowing through them. That cools the water, then it’s pumped back inside, repeat.
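
A minimal sketch of what that closed loop looks like in numbers (the 1 MW load and 10 K temperature rise are assumed figures, just to show the scale):

```python
# Sketch of closed-loop sizing: the same water circulates, so the only
# question is how fast it must move to carry the heat at a chosen
# temperature rise. All numbers are assumptions.

IT_LOAD_W = 1_000_000      # assumed heat load, 1 MW
SPECIFIC_HEAT = 4186       # J/(kg*K) for water
DELTA_T = 10               # assumed temperature rise across the servers, 10 K

flow_kg_per_s = IT_LOAD_W / (SPECIFIC_HEAT * DELTA_T)
print(f"Needed circulation: about {flow_kg_per_s:.0f} kg/s (~{flow_kg_per_s:.0f} L/s)")
# ~24 L/s circulating round and round; no water is consumed as long as the
# outside radiators (dry coolers) can dump the heat into the air.
```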

However, datacenters also use a lot of water to control humidity, especially in dry regions. That water gets evaporated and added to the air, because super dry air lets static electricity build up way too easily, and static electricity is bad for computers.
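
A rough estimate of how much water that humidification can eat. All of these numbers are assumptions (the humidity ratios are approximate values read off a psychrometric chart), but they show the order of magnitude:

```python
# Rough estimate of humidifier water use; every number here is an assumption.

AIRFLOW_M3_S = 50          # assumed fresh-air intake through the facility
AIR_DENSITY = 1.18         # kg/m^3 at roughly 25 C
W_DRY = 0.004              # kg water / kg dry air at ~25 C, 20% RH (chart value)
W_TARGET = 0.010           # kg water / kg dry air at ~25 C, 50% RH (chart value)

water_kg_per_s = AIRFLOW_M3_S * AIR_DENSITY * (W_TARGET - W_DRY)
print(f"Humidification water: ~{water_kg_per_s:.2f} kg/s, "
      f"~{water_kg_per_s * 86400 / 1000:.0f} tonnes per day")
# That water leaves the building as vapor in the exhaust air, so it is
# consumed, not recycled.
```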

Evaporative cooling is the most cost-efficient way to cool a facility. It takes a lot of energy for water to go from (hot) liquid to gas, which means that a small amount of water being evaporated gets you a lot of cooling capacity.

However, the reverse is also true; when water goes from gas to liquid, it dumps that heat into everything around it. So if you’re using evaporative cooling, then you necessarily have to eject the gaseous water as well, otherwise you’re just cycling the heat from one part of the facility to another. But since you’re ejecting water from the system, you need to bring in more water to replace it. Hence, you’re a net “consumer” of water, as that water can’t be used anymore.
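
Some illustrative arithmetic on why evaporating a little water removes a lot of heat (the 1 MW load is an assumed figure; the latent heat value is the standard ballpark for water):

```python
# Back-of-the-envelope comparison: latent heat of evaporation vs. sensible
# heat of just warming the water up. The 1 MW load is an assumption.

LATENT_HEAT = 2.4e6        # J/kg, approximate latent heat of vaporization of water
SPECIFIC_HEAT = 4186       # J/(kg*K), sensible heat of liquid water
IT_LOAD_W = 1_000_000      # assumed heat load, 1 MW

evaporated_kg_per_s = IT_LOAD_W / LATENT_HEAT
print(f"Evaporative: ~{evaporated_kg_per_s:.2f} kg/s "
      f"(~{evaporated_kg_per_s * 86400 / 1000:.0f} tonnes/day) is lost as vapor")

# Carrying the same heat sensibly with a 10 K temperature rise needs far more
# flow, but that water can be cooled down and reused instead of being lost.
sensible_kg_per_s = IT_LOAD_W / (SPECIFIC_HEAT * 10)
print(f"Sensible (10 K rise): ~{sensible_kg_per_s:.0f} kg/s circulating, none consumed")
```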

The alternative is to use a nearby river or waterway as a heat sink. You bring cool water in from the river, run it through the cooling system where it picks up the heat and comes out warm or hot, and then dump that water back into the river further downstream. Again, you’re “consuming” water, except now you’re also heating up the local waterway, which can have unforeseen consequences for the local wildlife.
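
A sketch of how much that once-through water warms up before it goes back into the river. The load and intake flow are assumed numbers:

```python
# Sketch of a once-through (river) cooling system; all numbers are assumptions.

IT_LOAD_W = 1_000_000      # assumed heat load, 1 MW
SPECIFIC_HEAT = 4186       # J/(kg*K) for water
INTAKE_KG_PER_S = 100      # assumed river intake, ~100 L/s

delta_t = IT_LOAD_W / (INTAKE_KG_PER_S * SPECIFIC_HEAT)
print(f"Return water is ~{delta_t:.1f} K warmer than the intake")
# ~2.4 K warmer per MW at this flow; scale the load up or the intake down and
# the thermal impact on the waterway grows accordingly.
```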

Depending on the system, either evaporation is occurring, or the heat-transfer process is causing minerals and particulates in the water to gradually build up to the point where they will damage or corrode equipment. In either case, that buildup must be discharged and treated, and new water brought in. Repeat.
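
That's the standard cooling-tower water balance: makeup water has to cover both what evaporates and what gets deliberately drained ("blowdown") before minerals concentrate too far. A quick sketch, with assumed numbers:

```python
# Cooling-tower "cycles of concentration" rule of thumb; the specific
# numbers here are assumptions, just to show the relationship.

EVAPORATION_KG_S = 0.42    # assumed evaporation rate (~1 MW of evaporative cooling)
CYCLES = 4                 # assumed cycles of concentration before minerals get too high

# Blowdown: water drained on purpose so dissolved minerals stop concentrating.
blowdown = EVAPORATION_KG_S / (CYCLES - 1)
# Makeup: fresh water brought in to replace evaporation plus blowdown.
makeup = EVAPORATION_KG_S + blowdown

print(f"Blowdown: ~{blowdown:.2f} kg/s, makeup water: ~{makeup:.2f} kg/s")
# Even the fraction that never evaporates eventually leaves as blowdown, which
# is why the facility keeps drawing new water instead of reusing the same batch.
```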

I know of a building in NYC that used the water from the data center (lots of mainframes) to heat the offices in the building next door. Mostly a closed system as I understand it.

In the winter it was great for them. In the summertime they tried some scheme to use the hot water to generate supplemental electricity for the AC systems. It didn’t work out.

By 2012 the whole building had been renovated and no longer has that model. The datacenter was moved out of the area post 9/11.