The information and communication technology sector is expanding rapidly, with connected devices expected to rise to 29.3 billion, up from 18.4 billion in 2018. This surge in internet use, combined with the explosive growth of AI, is fueling unprecedented demand for data center infrastructure that requires significant cooling capacity, much of it supplied by water. That dependence on water is becoming a growing vulnerability as climate change intensifies regional drought conditions.
Tech companies are projected to spend $375 billion on data centers this year, increasing to $500 billion by 2026. Yet this expansion carries mounting environmental risk. An assessment of 9,055 facilities indicates that by the 2050s, nearly 45% may face high exposure to water stress.
To build data centers sustainably without worsening water scarcity, we must understand why cooling is critical and how it shapes their overall water footprint. This article outlines why data centers need cooling, how current systems operate, and the key challenges in the transition to sustainable data center cooling solutions.
Why do data centers need cooling?
Demand for data centers continues to rise as AI, cloud computing, and data-intensive applications expand. This growth is driving sharp increases in power use across facilities. In early 2024, most centers supported rack power loads of 20 kW or more. Average rack density is expected to grow from 36 kW in 2023 to 50 kW by 2027, with denser racks requiring more water to cool advanced AI chips.
New server designs are integrating larger clusters of power-hungry processors to meet high-performance computing needs. At Nvidia’s GTC 2025, CEO Jensen Huang announced that upcoming racks such as the Rubin Ultra NVL576, expected in 2027, could draw up to 600 kW each. Such levels will place unprecedented pressure on cooling systems.
Cooling, which maintains stable operating temperatures and prevents equipment failure, accounts for 30% to 40% of total data center energy use. The rapid expansion of data centers is also driving a sharp rise in water demand. As chip density increases, heat loads surge, pushing operators to depend more heavily on water-based chillers and cooling towers to ensure consistent performance. A recent report projects that annual data center water consumption in the United States could double or even quadruple by 2028 compared with 2023 levels, reaching roughly 150–280 billion liters per year and putting additional pressure on already stressed regional water systems.
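To make these projections concrete, the back-of-the-envelope sketch below estimates annual cooling water for a single facility using the water usage effectiveness (WUE) metric, expressed in liters per kWh of IT energy. The facility size and WUE value are illustrative assumptions, not figures from the report cited above.

```python
# Back-of-the-envelope estimate of annual on-site cooling water for one
# facility, using water usage effectiveness (WUE): liters of water per kWh
# of IT energy. All input values are illustrative assumptions.

it_load_mw = 50          # assumed average IT load (MW)
hours_per_year = 8760
wue_l_per_kwh = 1.8      # assumed WUE; a commonly cited industry average is near 1.8 L/kWh

it_energy_kwh = it_load_mw * 1000 * hours_per_year
annual_water_liters = it_energy_kwh * wue_l_per_kwh

print(f"IT energy: {it_energy_kwh / 1e6:.0f} GWh per year")
print(f"Estimated cooling water: {annual_water_liters / 1e9:.2f} billion liters per year")
```

At this assumed scale, a single facility approaches a billion liters of water a year, which helps explain the national projections above.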
How does a typical data center cooling system work?
Modern data centers have moved beyond traditional HVAC toward specialized cooling systems designed for high-density computing. Air cooling has been the dominant method, but rising chip power and server density are making it insufficient. Standard air-based systems rely on computer room air conditioning (CRAC) units, which in some cases consume more electricity than the servers themselves. As heat loads climb, air can no longer remove enough heat from densely packed racks.
This shift is driving rapid adoption of liquid cooling. Water absorbs roughly 3,000 times more heat than air by volume, making it far better suited to removing heat from dense racks. Most facilities depend on freshwater because its consistent quality minimizes the mineral buildup, corrosion, and biological growth that could damage equipment.
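The "3,000 times" figure refers to heat absorbed per unit volume. The quick calculation below, using standard textbook properties of water and air at room temperature, shows where a ratio of that order comes from.

```python
# Comparing the volumetric heat capacity of water and air (energy absorbed
# per cubic meter per degree of temperature rise), using standard physical
# properties at roughly room temperature.

water_density = 1000.0      # kg/m^3
water_cp = 4186.0           # J/(kg*K)
air_density = 1.2           # kg/m^3 (near 20 C at sea level)
air_cp = 1005.0             # J/(kg*K)

water_volumetric = water_density * water_cp   # J/(m^3*K)
air_volumetric = air_density * air_cp         # J/(m^3*K)

print(f"Water: {water_volumetric / 1e6:.2f} MJ per m^3 per K")
print(f"Air:   {air_volumetric / 1e3:.2f} kJ per m^3 per K")
print(f"Ratio: about {water_volumetric / air_volumetric:,.0f}x")
```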
However, relying on freshwater carries environmental and social costs. Freshwater is limited, and many data centers operate in regions already facing scarcity. Using potable water for cooling can compete directly with local community needs, triggering growing public backlash in the United States, Chile, Ireland, the Netherlands, and other countries.
How is water used in a data center?
Data centers draw water across three main categories (a simple accounting sketch follows the list):
- Scope 1 – Direct on-site cooling: Water used in chillers, cooling towers, and liquid cooling systems to manage server heat.
- Scope 2 – Water for electricity generation: Most electricity still comes from thermoelectric power plants that require significant water for steam production and cooling. This is the largest contributor to a data center’s total water footprint.
- Scope 3 – Supply-chain water use: Water required for manufacturing servers, semiconductor chips, and other electronic equipment.
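For illustration, the sketch below combines the three scopes into a rough total for one hypothetical facility. Every input is a placeholder assumption; the point is only to show the structure of the accounting, with the scope 2 term derived from purchased electricity and an assumed water intensity of the grid.

```python
# Simple accounting sketch of a data center's total water footprint across
# the three scopes described above. Every number is a placeholder chosen
# for illustration only.

annual_electricity_kwh = 400e6      # assumed annual electricity purchased (kWh)
grid_water_l_per_kwh = 3.0          # assumed liters consumed per kWh of grid power (varies widely by generation mix)

scope1_onsite_cooling_l = 0.6e9     # direct cooling water (chillers, towers, liquid loops)
scope2_electricity_l = annual_electricity_kwh * grid_water_l_per_kwh   # water embedded in power generation
scope3_supply_chain_l = 0.2e9       # water embedded in servers, chips, and other hardware

total_l = scope1_onsite_cooling_l + scope2_electricity_l + scope3_supply_chain_l

for name, value in [("Scope 1 (on-site cooling)", scope1_onsite_cooling_l),
                    ("Scope 2 (electricity)", scope2_electricity_l),
                    ("Scope 3 (supply chain)", scope3_supply_chain_l)]:
    print(f"{name}: {value / 1e9:.2f} billion liters ({value / total_l:.0%} of total)")
```

With these placeholder values, scope 2 dominates, which matches the observation above that electricity generation is typically the largest contributor.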

Rising AI workloads generate far more heat than conventional cooling can handle efficiently. This is intensifying the shift toward solutions that cut or eliminate freshwater use. Key approaches include closed-loop cooling, alternative coolants, reclaimed water, dry cooling, and systems that rely on outside air or local climate conditions. Many of these methods are proven in other industries, but applying them at scale in data centers remains challenging.
Challenges in current data center cooling solutions
1. Rising power demand is overwhelming regional grids
Data center electricity demand in the United States is projected to grow by about 460 terawatt-hours between 2023 and 2030, roughly three times the current level of consumption. As data centers multiply and grow to gigawatt scale, cooling requires ever more power, placing greater demand on the grid. The pressure is most acute in regions where transmission capacity is already constrained.
Meeting this load requires significant new generation and expanded transmission infrastructure. Yet these projects move far slower than new data center development. This gap intensifies grid stress and complicates state decarbonization plans. Cooling systems add further strain because they draw substantial power year-round. Unless renewable energy scales rapidly, rising cooling demand risks slowing progress toward long-term climate targets.
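As a rough illustration of what that growth means for cooling specifically, the arithmetic below applies the 30% to 40% cooling share cited earlier to the projected demand increase. This is coarse scaling, not a forecast.

```python
# Rough illustration of how much of the projected demand growth could be
# attributable to cooling, combining the ~460 TWh growth figure with the
# 30-40% cooling share cited earlier.

demand_growth_twh = 460            # projected US data center demand growth, 2023-2030
cooling_share_low, cooling_share_high = 0.30, 0.40

cooling_growth_low = demand_growth_twh * cooling_share_low
cooling_growth_high = demand_growth_twh * cooling_share_high

print(f"Incremental cooling demand: {cooling_growth_low:.0f}-{cooling_growth_high:.0f} TWh per year by 2030")
```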

2. Data centers are intensifying scope 3 water consumption
Larger AI-focused data centers can host tens of thousands of servers, each carrying multiple advanced chips. Producing these chips requires ultrapure water, which is essential for cleaning and etching during fabrication. The process is highly water-intensive, needing about 1,500 gallons of piped water to create 1,000 gallons of ultrapure water. A typical chip plant uses nearly 10 million gallons per day. As a result, each chip arrives at a data center with a significant embedded water footprint.
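The sketch below simply annualizes the plant-level figures quoted above and converts them to liters; no per-chip allocation is attempted, since that would require fab throughput assumptions not given here.

```python
# Annualizing the fab water figures quoted above. The ratio and daily use
# come from the text; the conversion factors are standard.

piped_per_1000_gal_upw = 1500        # gallons of piped water per 1,000 gallons of ultrapure water
fab_daily_use_gal = 10_000_000       # "nearly 10 million gallons per day" for a typical chip plant
gallons_to_liters = 3.785

upw_per_gal_input = 1000 / piped_per_1000_gal_upw     # ~0.67 gal of ultrapure water per gallon piped in
fab_annual_use_liters = fab_daily_use_gal * 365 * gallons_to_liters

print(f"Ultrapure water yield: {upw_per_gal_input:.2f} gallons per gallon of piped water")
print(f"Annual fab water use: {fab_annual_use_liters / 1e9:.1f} billion liters")
```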
3. High capital costs and integration risks of liquid cooling
Liquid-cooling systems require higher upfront investment than traditional air cooling due to complex components such as pumps, tubing, and heat exchangers. Installation is more demanding, and the risk of leaks remains a major concern, as even minor leaks can damage critical hardware. Maintenance and upgrades are complicated, increasing both cost and operational risk.
4. Costly and complex retrofits for existing facilities
Retrofitting air-cooled data centers for liquid cooling is expensive because it often requires new infrastructure and server replacements. Unlike uniform air-cooling setups, liquid-cooling systems lack standardization, creating added design and maintenance complexity. Many facilities also need reinforced floors, upgraded foundations, and advanced fluid-management systems to operate liquid cooling safely.
5. Limited effectiveness and environmental risks of free cooling
Free cooling reduces energy use by leveraging naturally cool air or water, but it cannot operate year-round in many regions. Its effectiveness depends on climatic conditions, making it unreliable in warmer or highly variable environments. Free-air systems also introduce airborne contaminants, including sulfur compounds that trigger silver corrosion inside electronic components. This corrosion can produce conductive metal whiskers that cause electrical shorts and equipment failure.
Managing these risks requires additional infrastructure and operational oversight, which raises both cost and complexity for operators adopting free-cooling strategies.
6. Environmental and safety risks of immersion-cooling fluids
Immersion cooling relies on dielectric liquids such as hydrocarbons and fluorocarbons, including PFAS. These "forever chemicals" pose long-term environmental and health concerns because they persist in nature and do not degrade. Safety risks also arise during operation: submerged servers can generate toxic hydrogen fluoride gas if an arc flash occurs, creating a serious hazard for workers and equipment. Limited understanding of the long-term impacts of PFAS further compounds the challenge.
7. Limitations in reusing wastewater
Using treated wastewater for cooling remains challenging due to limited availability, added treatment needs, and regulatory constraints. Data centers typically lose about 80% of the water they withdraw to evaporation during cooling, and the rest is released as wastewater.
This discharge often contains minerals, chemicals, and heavy metals, making reuse difficult without intensive treatment. Most municipal plants cannot handle the high volumes generated by large data centers, raising overload risks and potential environmental harm if contaminants reach local water bodies.
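For a sense of scale, the split below applies the roughly 80% evaporation share noted above to a hypothetical annual withdrawal; the withdrawal volume itself is only a placeholder.

```python
# Splitting an assumed annual withdrawal into evaporative loss and discharge,
# using the ~80% evaporation share noted above. The withdrawal volume is a
# placeholder for illustration.

annual_withdrawal_liters = 1.0e9   # assumed withdrawal for a large facility
evaporation_fraction = 0.80        # share lost to evaporation during cooling

evaporated = annual_withdrawal_liters * evaporation_fraction
discharged = annual_withdrawal_liters - evaporated

print(f"Evaporated (consumed): {evaporated / 1e6:.0f} million liters per year")
print(f"Discharged as wastewater: {discharged / 1e6:.0f} million liters per year")
```

Even at this placeholder scale, the discharge stream runs to hundreds of millions of liters a year, which is what strains municipal treatment capacity.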
8. Low priority for sustainable water consumption
Water use remains a low priority for many data centers, even though cooling drives most consumption. Fewer than one-third of operators track their water use comprehensively, and only about half monitor it at any level. Most operators focus on power because it directly affects business performance, while water and waste receive far less attention. Many say there is no clear business case for tracking water, despite rising risks in water-scarce regions.
Accelerating the shift to lower data center water consumption
Cooling remains one of the biggest barriers to sustainable data center growth. Rising AI workloads and higher rack densities are accelerating power and water demand, increasing stress on local grids and freshwater systems. Improving cooling efficiency is now essential for long-term resilience.
Innovation in low-water cooling technologies will play a central role. Investment is needed to scale solutions that reduce freshwater dependence and lower the indirect water footprint of electricity use. Startups are already pushing new approaches that improve thermal performance, cut operational risk, and reduce the strain on water resources in surrounding communities.
Building a sustainable data center ecosystem will require coordinated progress. Advancing new cooling technologies, expanding renewable energy use, and strengthening water-management practices are critical steps to support the next wave of AI-driven digital infrastructure.


