AI datacenters may gulp NYC's daily water supply at peak • The Register
#Infrastructure

Regulation Reporter
4 min read

New study warns US water systems will need billions in upgrades to meet datacenter cooling demands during hottest days, with peak usage potentially matching NYC's entire daily supply by 2030.

A new study from the University of California, Riverside, warns that America's water infrastructure faces a looming crisis as AI datacenters expand across the country. The research reveals that peak cooling demands from server farms could strain public water systems to their breaking point, potentially requiring water capacity equivalent to New York City's entire daily supply during the hottest days of the year.

The study acknowledges that water remains one of the most efficient cooling methods for datacenters seeking to minimize power usage. However, it highlights a critical disconnect between annual water consumption patterns and the massive spikes in demand that occur during peak cooling periods.

According to the report, datacenters across America may require between 697 million and 1.45 billion gallons of additional peak water capacity per day by 2030. For comparison, New York City's average daily water supply is approximately one billion gallons, so the upper end of that range exceeds the city's entire supply. Even under the most optimistic projections for water use reduction, the new capacity required could amount to half of New York's supply for most of the year.
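To put those figures side by side, a trivial calculation using the article's rounded numbers (the constants below are taken from the text, not from the underlying study) shows the projected range as a fraction of NYC's supply:

```python
# Put the projected 2030 peak-demand range in context of NYC's supply.
NYC_SUPPLY_MGD = 1_000  # approximate NYC daily supply, million gallons/day
projected_low_mgd, projected_high_mgd = 697, 1_451  # range from the report

for label, mgd in [("low", projected_low_mgd), ("high", projected_high_mgd)]:
    print(f"{label}: {mgd / NYC_SUPPLY_MGD:.0%} of NYC's daily supply")
# low: 70% of NYC's daily supply
# high: 145% of NYC's daily supply
```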

The cooling challenge stems from how datacenters manage heat. The process typically occurs in two stages: first, server-level cooling transfers heat from IT equipment to facility-level heat exchangers through air-based or closed-loop liquid cooling systems. This initial stage rarely involves direct water consumption. The second stage, facility-level cooling, transfers heat from the datacenter to the outside environment and is where water usage becomes significant.

Depending on the technology employed, facility-level cooling may use cooling towers that rely on evaporation, air-cooled systems supplemented by direct evaporation, or adiabatic cooling to reduce peak power demand during hot weather. A large server farm using evaporation cooling can consume millions of gallons of water per day during the hottest periods—significantly more than during cooler times.
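A back-of-envelope check makes the scale of evaporative cooling concrete: the water consumed can be estimated from the latent heat of vaporization of water. The 100 MW load and the all-evaporation assumption below are illustrative simplifications, not figures from the study:

```python
# Back-of-envelope: water evaporated to reject a given heat load.
LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water
LITERS_PER_GALLON = 3.785
SECONDS_PER_DAY = 86_400

def evaporation_gallons_per_day(heat_load_watts):
    """Gallons/day of water evaporated to reject the given heat load,
    assuming all heat leaves via evaporation (an idealization)."""
    kg_per_second = heat_load_watts / LATENT_HEAT_J_PER_KG
    liters_per_day = kg_per_second * SECONDS_PER_DAY  # 1 kg water ~ 1 liter
    return liters_per_day / LITERS_PER_GALLON

# A hypothetical 100 MW load rejecting all of its heat evaporatively:
print(f"{evaporation_gallons_per_day(100e6) / 1e6:.1f} million gallons/day")
# 1.0 million gallons/day
```

Even this idealized figure of roughly a million gallons a day for a single 100 MW facility lands inside the per-facility range the study cites below, which is why peak-day demand dwarfs annual averages.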

The study emphasizes that while not all water withdrawn by datacenters is "consumed" through evaporation or removal, any water taken in by a server farm becomes unavailable for other users. This creates the core problem: peak demand periods when both power grids and water systems face maximum stress simultaneously.

America's water infrastructure presents unique challenges for datacenter expansion. The country has roughly 50,000 community water systems, with approximately 40,000 serving no more than 3,300 people each. Only 708 systems serve populations exceeding 100,000. Nearly all hyperscale and colocation facilities draw water from these community systems, primarily from potable sources, with only a few relying on private groundwater.

These public water systems are designed to meet maximum demand reliably at all times, with additional margins for extreme conditions like prolonged heatwaves and droughts. However, the study finds that many systems may struggle to support even a 100 MW IT load requiring water-based cooling. Depending on local climate conditions and cooling system design, such a facility could need approximately 0.5 to 2.5 million gallons per day of water capacity.
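The study's per-facility figure can be scaled to other facility sizes. Linear scaling with IT load is an assumption made here for illustration only; real demand depends on climate and cooling design, as the study notes:

```python
# Scale the study's per-facility range (0.5-2.5 million gallons/day per
# 100 MW of IT load) to other sizes; linear scaling is assumed.
MGD_PER_MW_LOW = 0.5 / 100
MGD_PER_MW_HIGH = 2.5 / 100

def peak_water_range_mgd(it_load_mw):
    """(low, high) estimate of peak water capacity in million gallons/day."""
    return it_load_mw * MGD_PER_MW_LOW, it_load_mw * MGD_PER_MW_HIGH

low, high = peak_water_range_mgd(1_000)  # a hypothetical gigawatt-scale campus
print(f"{low:.0f} to {high:.0f} million gallons/day")
# 5 to 25 million gallons/day
```

Under that assumption, a single gigawatt-scale campus could demand several times what most of the country's 50,000 community water systems deliver in total.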

Gigawatt-scale AI facilities currently being planned for deployment across America would require even more substantial water resources. The researchers note that many datacenter projects have already required significant upgrades to local water infrastructure, even when peak water demand was as low as 0.1 million gallons per day.

Overall, the report concludes that US server farms are projected to require 697 to 1,451 million gallons per day of new peak water capacity by 2030, at a potential cost of up to $58 billion in infrastructure upgrades. That added capacity is comparable to New York City's entire average daily water supply.

The authors recommend several strategies to address this growing challenge. First, they suggest that datacenter operators report peak water use rather than just yearly averages to aid in planning and infrastructure development. Second, they recommend partnerships between datacenter operators and local communities to fund water infrastructure upgrades jointly.

A particularly innovative recommendation involves closer coordination with utilities, where datacenters would use water-based cooling when the power grid is stressed but switch to dry cooling when the community water system is stressed. However, the report acknowledges the complexity of this approach, noting that no clear solution exists for days when both systems face peak stress simultaneously.

The study arrives amid growing concerns about datacenter expansion across the United States. Recent developments include pushback from local communities against new datacenter construction, efforts by tech companies to develop alternative cooling technologies, and increasing scrutiny of the water-energy nexus in infrastructure planning.

As AI development accelerates and demands for computing power continue to grow, the intersection of water resources and datacenter cooling will likely become an increasingly critical issue for policymakers, utility companies, and technology companies alike. The UC Riverside study serves as an early warning signal that the water infrastructure supporting America's digital future may need substantial investment and innovative management approaches to meet the challenges ahead.
