Why Do Data Centers Need So Much Water?

Data centers need water primarily to cool the thousands of servers inside them. Every computer generates heat, and when you pack tens of thousands of them into a single building, the combined heat output is enormous. Water absorbs heat far more efficiently than air does, making it the most practical way to keep servers at safe operating temperatures. Without adequate cooling, processors overheat within minutes, throttle their performance, and eventually fail.

How Servers Generate So Much Heat

Every calculation a computer chip performs converts electricity into heat. A single server rack in a typical data center draws around 1.2 kilowatts of power, and nearly all of that energy eventually becomes thermal waste. Modern AI-focused racks can draw 30 to 50 kilowatts or more. Multiply that by hundreds or thousands of racks in a single facility, and you get a building that produces heat on an industrial scale.
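
To put those figures in perspective, here is a quick back-of-the-envelope calculation using the rack power numbers above. The 1,000-rack facility size is a hypothetical assumption for illustration.

```python
# Back-of-the-envelope thermal load, using the rack figures quoted above.
# The 1,000-rack facility size is a hypothetical assumption.

CONVENTIONAL_RACK_KW = 1.2   # typical rack draw cited in the text
AI_RACK_KW = 40.0            # midpoint of the 30-50 kW AI-rack range
NUM_RACKS = 1_000            # assumed facility size

def total_heat_mw(kw_per_rack: float, racks: int) -> float:
    """Nearly all electrical draw ends up as heat, so power in ~ heat out."""
    return kw_per_rack * racks / 1_000  # kW -> MW

print(f"Conventional: {total_heat_mw(CONVENTIONAL_RACK_KW, NUM_RACKS):.1f} MW of heat")
print(f"AI-focused:   {total_heat_mw(AI_RACK_KW, NUM_RACKS):.1f} MW of heat")
# Conventional: 1.2 MW of heat
# AI-focused:   40.0 MW of heat
```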

Server rooms must stay within a narrow temperature band, generally between 64°F and 80°F at the air intake. If temperatures climb above that range, chips begin throttling themselves to prevent damage, slowing the workloads they’re running. Sustained overheating shortens the lifespan of processors, memory, and storage drives. Facilities also need to control humidity carefully. Too dry, and static electricity can damage components. Too humid, and moisture can corrode the copper and silver contacts inside equipment. Industry guidelines recommend keeping dew points between roughly 27°F and 59°F, with relative humidity kept below a ceiling of roughly 50% to 70%, depending on local air quality.
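
As a rough illustration, here is a minimal sketch of how a monitoring script might flag readings outside those bands. The thresholds come straight from the guidelines quoted above, simplified from the more nuanced recommended-versus-allowable envelopes real facilities follow.

```python
# Minimal sanity check against the environmental ranges quoted above.
# Thresholds are from the text; the 60% RH ceiling is an assumed midpoint
# of the 50-70% range, which varies with local air quality.

def intake_air_ok(temp_f: float, dew_point_f: float, rel_humidity: float) -> list[str]:
    """Return a list of violations for one sensor reading (empty = OK)."""
    problems = []
    if not 64 <= temp_f <= 80:
        problems.append(f"intake temp {temp_f}F outside 64-80F band")
    if not 27 <= dew_point_f <= 59:
        problems.append(f"dew point {dew_point_f}F outside 27-59F band")
    if rel_humidity > 60:
        problems.append(f"relative humidity {rel_humidity}% above assumed 60% ceiling")
    return problems

print(intake_air_ok(temp_f=85, dew_point_f=40, rel_humidity=45))
# ['intake temp 85F outside 64-80F band']
```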

Where Water Fits Into the Cooling Process

The most common cooling approach uses evaporative cooling towers. Warm water circulates through the data center, absorbing heat from the air around the servers or from cooling coils. That heated water then flows to a cooling tower outside the building, where it’s sprayed into a stream of air. A portion of the water evaporates, and that evaporation carries the heat away, just like sweat cooling your skin. The cooled water loops back into the building to absorb more heat.
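
The physics can be made concrete with the latent heat of vaporization: each kilogram of water that evaporates carries away roughly 2,400 kilojoules at cooling-tower temperatures. The sketch below estimates evaporative loss for a hypothetical 10-megawatt heat load; real towers lose additional water to drift and blowdown, so treat this as a floor.

```python
# Rough estimate of evaporative water loss, assuming a hypothetical
# 10 MW heat load and a latent heat of vaporization of ~2,400 kJ/kg
# for water at cooling-tower temperatures.

LATENT_HEAT_J_PER_KG = 2.4e6   # approximate latent heat of vaporization
KG_PER_GALLON = 3.785          # mass of one US gallon of water

def gallons_evaporated_per_day(heat_load_mw: float) -> float:
    """Water that must evaporate to carry away the given heat load."""
    joules_per_day = heat_load_mw * 1e6 * 86_400
    kg_per_day = joules_per_day / LATENT_HEAT_J_PER_KG
    return kg_per_day / KG_PER_GALLON

print(f"{gallons_evaporated_per_day(10):,.0f} gallons/day")
# ~95,000 gallons/day from evaporation alone
```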

This evaporation is where the water actually gets “used up”: the water that turns to vapor is gone, and facilities must continuously replenish it. Open-loop systems, where the cooling water makes direct contact with outside air, lose the most water through evaporation and also require chemical treatment to prevent bacterial growth and mineral buildup. Closed-loop systems keep the primary cooling water sealed inside pipes and use a separate spray of water on the outside of the heat exchanger to handle evaporation. They consume less water overall and stay cleaner internally, but they’re slightly less efficient at transferring heat.

Some data centers in cooler climates can rely on outside air alone for part of the year, a technique called free cooling. But even these facilities typically switch to water-assisted cooling during warmer months, because air on its own simply can’t move enough heat fast enough when outdoor temperatures rise.

Why AI Is Making the Problem Worse

Traditional air cooling has worked for decades. Fans push chilled air through rows of servers, and that’s enough when each rack generates a modest amount of heat. But modern AI chips, designed for machine learning and large language models, pack far more processing power into the same physical space. They generate heat densities that air cooling struggles to handle. Hot spots form in the aisles between cabinets, and no amount of fan speed can keep up.

This is pushing the industry toward direct liquid cooling, where water or another coolant flows through cold plates mounted directly on the hottest chips. The liquid absorbs heat right at the source, which is dramatically more efficient than waiting for that heat to radiate into the surrounding air. These systems use less total energy for cooling and allow data centers to pack more computing power into less space. But they still need water as part of the cooling loop, and the growing number of high-density AI facilities means total water demand keeps climbing.
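
The sizing math behind a cold-plate loop follows from Q = ṁ·c·ΔT: the heat absorbed equals the mass flow rate times water’s specific heat times the coolant’s temperature rise. The sketch below works that out for an assumed 1,000-watt chip and a 10°C rise across the plate, both illustrative figures.

```python
# Sizing sketch for a cold-plate loop: the flow rate needed to absorb a
# chip's heat at a given coolant temperature rise (Q = m * cp * dT).
# The 1,000 W chip power and 10 C rise are illustrative assumptions.

WATER_SPECIFIC_HEAT = 4186.0   # J/(kg*C)
WATER_DENSITY = 1.0            # kg/L (close enough for this estimate)

def flow_liters_per_minute(chip_watts: float, delta_t_c: float) -> float:
    """Coolant flow required so the loop warms by delta_t_c across the plate."""
    kg_per_s = chip_watts / (WATER_SPECIFIC_HEAT * delta_t_c)
    return kg_per_s / WATER_DENSITY * 60

print(f"{flow_liters_per_minute(1_000, 10):.2f} L/min per chip")
# ~1.43 L/min -- modest per chip, but it multiplies across thousands of chips
```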

A more radical approach is immersion cooling, where entire servers are submerged in tanks of a special nonconductive liquid, typically a synthetic hydrocarbon. The fluid doesn’t conduct electricity, so it won’t short-circuit the components, but it absorbs heat much more effectively than air. Immersion cooling can significantly reduce or even eliminate the need for water-based evaporative towers, though it introduces its own costs and complexity. It remains a small fraction of the market today.

How Much Water Data Centers Actually Use

The numbers are striking. A large data center can consume up to 5 million gallons of water per day, roughly the same as a city of 50,000 people. Google reported using more than 5 billion gallons across all its data centers in 2023. A single Meta facility in Newton County, Georgia, draws 500,000 gallons daily, which represents 10 percent of the entire county’s water supply.
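
The city comparison holds up to a quick sanity check, since typical U.S. residential use runs roughly 80 to 100 gallons per person per day:

```python
# Sanity check on the city-of-50,000 comparison quoted above.
facility_gal_per_day = 5_000_000
city_population = 50_000

print(f"{facility_gal_per_day / city_population:.0f} gallons/person/day")
# 100 gallons/person/day -- in line with typical U.S. residential use
```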

The trend is accelerating. U.S. data centers collectively used about 5.6 billion gallons of water in 2014. By 2023, that figure had tripled to approximately 17.4 billion gallons, with 84 percent of that consumption concentrated in the largest “hyperscale” facilities run by companies like Google, Microsoft, Amazon, and Meta. Google’s water consumption alone rose roughly 20 percent from 2021 to 2022 and another 17 percent from 2022 to 2023. Microsoft saw even steeper increases of 34 percent and 22 percent over those same periods.

Water Stress in Local Communities

These facilities don’t draw from some abstract water supply. They tap into local municipal systems, rivers, and aquifers, sometimes in regions already facing scarcity. In The Dalles, Oregon, a city of 16,000 people, Google’s data centers consumed 355 million gallons of water in 2021, a full quarter of the city’s annual supply. Google initially resisted disclosing this figure, even paying $100,000 to fund the city’s lawsuit against a newspaper that filed a public records request for the data.

In the Phoenix metro area, nearly 60 data centers together demand about 177 million gallons a day. Meta’s facility in Goodyear, west of Phoenix, uses around 56 million gallons of potable water annually, equivalent to the household use of about 670 homes. While these numbers are small compared to agricultural water use in the region, they add new pressure in a desert already struggling with long-term drought and declining reservoir levels. Google has acknowledged that 31 percent of its freshwater withdrawals come from watersheds rated as medium or high water scarcity.

Recycled Water and Alternatives

Some companies are working to shift away from drinking water. Google uses reclaimed or non-potable water at over 25 percent of its data center campuses. Its facility in Douglas County, Georgia, runs entirely on recycled municipal wastewater. Amazon Web Services announced in 2023 that 20 of its data centers cool with purified wastewater instead of potable water.

These are meaningful steps, but they remain the exception. Most data centers worldwide still rely on potable water from local utilities. Switching to reclaimed water requires infrastructure to treat and pipe wastewater to the facility, which means cooperation with local governments and upfront investment that not every project includes. In water-scarce regions, even reclaimed water has competing uses, from irrigation to replenishing groundwater.

The gap between the industry’s water consumption trajectory and its conservation efforts is still widening. Total data center water use is growing faster than recycled water adoption, driven by the explosive demand for AI computing power. For communities hosting these facilities, the question isn’t just how much water data centers need today, but how much they’ll need five years from now as the next generation of chips runs hotter and the racks get denser.