Why do datacenters continuously use more water instead of recycling the same water in a closed loop system?

14 Answers

Anonymous 0 Comments

Because the water is used to remove heat. In order to carry the heat away, it must be either discharged or evaporated. If it’s just looped, it soon gets unusably hot.

I don’t know about data centers specifically, but some cooling systems do have a closed loop of water; one side of that loop is then cooled by water that’s evaporated or discharged, or by some other method.

Anonymous 0 Comments

Evaporative cooling is the most cost-efficient way to cool a facility. It takes a lot of energy for water to go from (hot) liquid to gas, which means that a small amount of water being evaporated gets you a lot of cooling capacity.

However, the reverse is also true; when water goes from gas to liquid, it dumps that heat into everything around it. So if you’re using evaporative cooling, then you necessarily have to eject the gaseous water as well, otherwise you’re just cycling the heat from one part of the facility to another. But since you’re ejecting water from the system, you need to bring in more water to replace it. Hence, you’re a net “consumer” of water, as that water can’t be used anymore.
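To put a rough number on why evaporation is so effective: water’s latent heat of vaporization is around 2.4 MJ/kg near ambient temperature. A back-of-envelope sketch (the 1 MW heat load below is an illustrative assumption, not a figure from any particular facility):

```python
# Back-of-envelope: how much water must evaporate to absorb a given heat load?
# All inputs are illustrative assumptions, not data from a real facility.

LATENT_HEAT = 2.4e6  # J/kg, latent heat of vaporization of water near ambient

def evaporation_rate(heat_load_watts: float) -> float:
    """Water evaporated per second (kg/s) to carry away the given heat load."""
    return heat_load_watts / LATENT_HEAT

load = 1e6  # assume a 1 MW IT heat load
kg_per_s = evaporation_rate(load)
print(f"{kg_per_s:.2f} kg/s ~= {kg_per_s * 3600 / 1000:.1f} m^3/h ~= {kg_per_s * 86400 / 1000:.0f} m^3/day")
# -> roughly 0.42 kg/s, i.e. ~1.5 m^3/h, or ~36 m^3 of water per day for 1 MW
```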

The alternative is to use a nearby river or waterway as a heat sink. You bring cool water in from the river, run it through the cooling system to bring it from cool to warm or hot, and then dump that water back into the river, further downstream. Again, you’re “consuming” water, except now you’re also heating up the local waterway, which could have unforeseen consequences for the local wildlife.
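How much the river warms up follows directly from the specific heat of water, Q = ṁ·c·ΔT. A quick sketch, where both the heat load and the intake flow rate are assumed for illustration:

```python
# How much does the discharge warm the intake stream?
# Q = m_dot * c_p * dT  ->  dT = Q / (m_dot * c_p)
# Numbers are illustrative assumptions.

C_P = 4186.0        # J/(kg*K), specific heat of water

heat_load = 1e6     # W, assumed 1 MW facility heat load
intake_flow = 50.0  # kg/s, assumed river water drawn through the heat exchangers

delta_t = heat_load / (intake_flow * C_P)
print(f"Discharge runs {delta_t:.1f} C warmer than intake")  # ~4.8 C
```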

Anonymous 0 Comments

It depends on what the water is being used for.

If it’s being used for cooling, it is recycled in a closed loop. The water comes in and pulls heat out of the cooling units, then gets pumped to outside radiators with fans blowing through them. That cools the water, then it’s pumped back inside, repeat.
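The same heat equation also tells you how fast the water in a closed loop like that has to circulate. A sketch with assumed numbers: if the loop is allowed to warm by 10 °C between the cooling units and the outside radiators, a 1 MW load needs roughly 24 kg/s of circulation.

```python
# Required circulation rate for a closed cooling loop.
# m_dot = Q / (c_p * dT). Inputs are illustrative assumptions.

C_P = 4186.0      # J/(kg*K), specific heat of water

heat_load = 1e6   # W, assumed IT heat load
loop_dt = 10.0    # K, assumed temperature rise across the cooling units

flow = heat_load / (C_P * loop_dt)
print(f"{flow:.1f} kg/s (~{flow * 3.6:.0f} m^3/h) must circulate")  # ~23.9 kg/s, ~86 m^3/h
```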

However, datacenters also use a lot of water to control humidity, especially in dry regions. That water gets evaporated and added to the air, because static charge builds up far too easily in very dry air, and static electricity is bad for computers.
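The humidification water can be estimated from humidity ratios. A rough sketch using the Magnus approximation for saturation vapor pressure; the airflow and humidity setpoints are assumptions for illustration:

```python
import math

# Water needed to humidify incoming air, via humidity ratios.
# Uses the Magnus approximation for saturation vapor pressure.
# Airflow and setpoints below are illustrative assumptions.

P_ATM = 101325.0  # Pa, standard atmospheric pressure

def p_sat(t_c: float) -> float:
    """Saturation vapor pressure (Pa), Magnus approximation."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(t_c: float, rh: float) -> float:
    """kg of water vapor per kg of dry air."""
    p_v = rh * p_sat(t_c)
    return 0.622 * p_v / (P_ATM - p_v)

airflow = 10.0                      # kg/s of dry air, assumed
w_in = humidity_ratio(25.0, 0.20)   # dry intake air: 25 C, 20% RH
w_out = humidity_ratio(25.0, 0.45)  # target: 45% RH

water = airflow * (w_out - w_in)
print(f"{water * 3600:.0f} kg of water per hour")  # ~180 kg/h for these assumptions
```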

Anonymous 0 Comments

Depending on the system, either evaporation is occurring, or the heat-transfer process causes minerals and particulates in the water to build up to the point where they will damage or corrode equipment. In either case, that buildup must be discharged and treated, and new water brought in. Repeat.

Anonymous 0 Comments

I know of a building in NYC that used the water from the data center (lots of mainframes) to heat the offices in the building next door. Mostly a closed system as I understand it.

In the winter it was great for them. In the summertime they tried some scheme to use the hot water to generate supplemental electricity for the AC systems. Didn’t work out.

By 2012 the whole building had been renovated and no longer used that model. The datacenter was moved out of the area post-9/11.

Anonymous 0 Comments

Depends on the area as well. There are a lot of water restrictions on data center cooling where we live, and some data center owners simply don’t want water in the facility. Most of the data center designs I work on are air-cooled.

Anonymous 0 Comments

They’re not *using up* the water. They’re using it to cool their machines, and returning it to the environment slightly warmer than it was before. Here’s an example of a relatively modern datacenter cooling setup — not the newest (it’s from ~10 years ago), but one that uses seawater, which is somewhat unusual.

https://www.datacenterdynamics.com/en/news/googles-finland-data-center-pioneers-new-seawater-cooling/

Google’s Finland datacenter takes in cold seawater and uses it to cool the fresh water that’s then used to cool the computers. They then mix the slightly-warmed seawater with other cold seawater before returning it to the ocean.
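The temperature of that blended discharge is just a mass-weighted average of the two streams. A sketch with assumed flows and temperatures (the article doesn’t give exact figures):

```python
# Blending warmed cooling water with cold seawater before discharge.
# T_mix = (m1*T1 + m2*T2) / (m1 + m2). Flows and temps are assumptions.

warm_flow, warm_t = 1.0, 35.0  # kg/s and C, assumed warmed cooling outflow
cold_flow, cold_t = 4.0, 10.0  # kg/s and C, assumed cold seawater for dilution

t_mix = (warm_flow * warm_t + cold_flow * cold_t) / (warm_flow + cold_flow)
print(f"Discharge temperature: {t_mix:.1f} C")  # 15.0 C for these assumptions
```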

Anonymous 0 Comments

I don’t know of a data center that uses open loop cooling (dumping heated water).

What I do know of is evap cooling. In this case, you have a closed loop with radiators outside. Those radiators operate passively (well, with fans) most of the time. You can spray water on the radiators, and the evaporating water helps cool down the loop. (These systems usually still have chillers to bring the water down to the correct temperature, but letting the radiators do most of the work ahead of time is more energy efficient.)
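The spray helps because evaporation pulls the air hitting the radiators down toward its wet-bulb temperature. A sketch using the standard evaporative-cooler effectiveness formula, with all temperatures assumed for illustration:

```python
# How much does spraying water cool the air hitting the radiators?
# T_out = T_dry - eff * (T_dry - T_wet): standard evaporative-cooler formula.
# Temperatures and effectiveness below are illustrative assumptions.

t_dry = 35.0         # C, assumed hot dry-bulb air temperature
t_wet = 22.0         # C, assumed wet-bulb temperature (the evaporative limit)
effectiveness = 0.8  # assumed fraction of the dry/wet-bulb gap achieved

t_onto_radiator = t_dry - effectiveness * (t_dry - t_wet)
print(f"Air hits the radiators at {t_onto_radiator:.1f} C instead of {t_dry:.0f} C")  # 24.6 C
```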

What I could see working, though I don’t know of any facilities that do this, is having river water or ocean water pumped over those radiators, similar to nuclear plants. In this case the liquid doing the cooling is still in a closed loop, but it uses another body of water to bring its temperature down. Depending on the location, chillers may not be needed either.

Anonymous 0 Comments

Is their water use consumptive or non-consumptive? In other words, if they are taking cool water from a source and putting warm water back into that source, it could be considered that they are using the same water over and over again.

Anonymous 0 Comments

Around 1980 the company I worked for used the cooling water from our big IBM mainframes to heat the building.