U.S. data centers consumed 17 billion gallons of water in 2023. In some communities, signed contracts already demand more than the local system can sustain. A Virginia court ruled this data is public. A journalist's $86 court filing forced Google to comply. Communities that know what to ask for are winning. This section shows you how.
Understanding water consumption requires three distinctions the industry consistently blurs. Withdrawal is water taken from a source. Consumption is water permanently lost — primarily through evaporation. Discharge is water returned to wastewater systems. In evaporative cooling — still the dominant technology — roughly 80% of withdrawn water evaporates, never returned. Companies typically report the metric that makes them look best. Always ask which definition is being used.
The indirect water footprint is almost never disclosed. Including water consumed through electricity generation raises total U.S. data center water use from 17 billion to an estimated 228 billion gallons annually, a more than 13-fold difference that rarely appears in corporate sustainability reports.
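The arithmetic behind these distinctions can be made explicit. A minimal sketch using only figures quoted above (the ~80% evaporative-loss share and the 17B/228B gallon estimates); the helper function is illustrative, not an industry-standard calculation:

```python
# Rough arithmetic behind the figures above. All inputs come from the
# text; the 80% evaporative-loss share is the approximate figure cited
# for evaporative cooling.

def water_balance(withdrawal_gal: float, evaporative_share: float = 0.80):
    """Split a withdrawal into consumption (evaporated) and discharge."""
    consumption = withdrawal_gal * evaporative_share
    discharge = withdrawal_gal - consumption
    return consumption, discharge

direct_gal = 17e9   # reported direct U.S. data center use, 2023
total_gal = 228e9   # estimate including water embedded in electricity

consumed, returned = water_balance(1_000_000)  # a 1M-gallon withdrawal
print(f"consumed: {consumed:,.0f} gal, returned: {returned:,.0f} gal")
print(f"multiplier implied by the quoted totals: {total_gal / direct_gal:.1f}x")
```

The point of the split: a company reporting only withdrawal or only discharge can make the same facility look very different, which is why the section says to always ask which definition is in use.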
An executed agreement projects peak demand of 2–8 million gallons of drinking water per day for Google from Carvins Cove Reservoir, the Roanoke Valley's primary drinking supply. The Western Virginia Water Authority's largest existing customer, a Coca-Cola plant, averages 260,000 gallons per day. Google's projected demand: 7–30× larger. The water authority fought in court to keep these numbers secret. Lost twice.
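The demand multiple follows directly from the contract's own numbers; a quick check using only figures from the text, truncating to whole multiples:

```python
# Figures quoted in the text above; no other data assumed.
peak_low_gpd, peak_high_gpd = 2e6, 8e6  # projected peak demand, gal/day
largest_customer_gpd = 260_000          # Coca-Cola plant average, gal/day

low_mult = int(peak_low_gpd / largest_customer_gpd)    # truncates 7.69 -> 7
high_mult = int(peak_high_gpd / largest_customer_gpd)  # truncates 30.77 -> 30
print(f"{low_mult}x to {high_mult}x the largest existing customer")
```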
xAI's facility draws from the Memphis Sand Aquifer, which lies beneath unlined coal ash ponds containing arsenic. 35 natural gas turbines were installed without air permits. Projected use: 5 million gallons per day. The surrounding community is predominantly Black, with elevated asthma rates. The NAACP filed a Clean Air Act suit. The facility operated for months before permits were applied for.
The Cambrian-Ordovician aquifer serving Joliet is projected to be depleted by 2030 at current extraction rates, a timeline accelerated by data center demand growth. The city is spending $1 billion to connect to Lake Michigan, a cost substantially driven by industrial demand the original aquifer was never sized to support.
An 800 MW facility was proposed directly above a high-susceptibility aquifer. The eight-page development agreement contained zero binding groundwater protections. Community members weren't shown the agreement before the vote. The gap between verbal presentation and contract language is the textbook case for demanding legal review before any public vote.
Texas data centers consumed an estimated 49 billion gallons in 2025. The current trajectory for 2030: 399 billion gallons, roughly equivalent to lowering Lake Mead by 16 feet in a single year. Parts of Texas already sit over critically declining sections of the Ogallala Aquifer. Adding data center demand at this scale to an already-strained system is a policy choice being made largely without public deliberation.
On Earth Day 2026, Erin Brockovich and national water expert Robert Bowcock publicly named the data center water crisis as following the same pattern she has spent her career documenting: corporate opacity, inadequate regulatory oversight, communities bearing costs they didn't consent to and often don't know about. The parallel isn't about contamination; it's about the systematic suppression of information about how public resources are allocated. The Roanoke Rambler case, The Dalles lawsuit, Wisconsin's trade secret argument: this is the same playbook she has documented for decades in different industries.
Evaporative cooling became the hyperscale standard because it lowers PUE — Power Usage Effectiveness — making facilities more energy-efficient. But it raises WUE — Water Usage Effectiveness — meaning more water consumed per unit of compute. The industry optimized for PUE because energy cost money and water was cheap and abundant. That calculus has shifted in water-stressed regions. But the infrastructure was already built for the old calculus — and retrofitting it is expensive enough that voluntary adoption of alternatives is slow without mandates.
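Both metrics have standard definitions: PUE is total facility energy divided by IT equipment energy (dimensionless, always ≥ 1.0), and WUE is annual site water use divided by IT energy, in liters per kWh. A sketch of the tradeoff described above; the example numbers are hypothetical, chosen only to show the direction of the tradeoff, not measurements from any named facility:

```python
# Standard metric definitions; the inputs below are hypothetical
# illustration values, not reported data.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: >= 1.0, lower is better."""
    return total_facility_kwh / it_kwh

def wue(site_water_liters: float, it_kwh: float) -> float:
    """Water Usage Effectiveness in L/kWh: lower is better."""
    return site_water_liters / it_kwh

it_kwh = 100_000_000  # hypothetical annual IT load

# Evaporative cooling: less overhead energy, more water.
print(pue(115_000_000, it_kwh), wue(180_000_000, it_kwh))  # 1.15, 1.8

# Closed-loop mechanical cooling: more overhead energy, near-zero water.
print(pue(130_000_000, it_kwh), wue(2_000_000, it_kwh))    # 1.3, 0.02
```

Optimizing PUE alone rewards the first configuration; only tracking both metrics exposes the water cost.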
"There are few resources more precious than water."
— Judge Leisa Ciaffone, Roanoke Circuit Court, November 2025

All four major hyperscalers have pledged to be "water positive" by 2030. As of 2023–2024: Google had replenished only ~18% of its 120% target while consumption rose 17% year-over-year to 5.2 billion gallons. Microsoft acknowledged that "the moon has gotten further away": absolute consumption increased even as efficiency improved. AWS reports 53% progress. Meta's Beaver Dam facility uses close to zero water, but this is one facility. The commitments are real. The system-wide trajectory is not yet matching them.
Water conflicts between industrial users and communities are not new. What distinguishes the data center moment is the speed of scaling, the opacity of consumption, and the fact that an industry presenting itself as the future is drawing down a resource that is becoming scarcer, in many cases without the affected communities having any meaningful say in the terms.
The pattern Erin Brockovich has spent her career documenting maps directly onto what's happening now: industrial users negotiating access to public resources through processes designed to minimize community awareness. The Hinkley groundwater case involved a utility that knew about contamination and chose not to disclose it. The Dalles case involved Google paying city legal bills while the city sued a newspaper to keep water data secret. The mechanism differs. The dynamic is identical.
History also offers models of change. The Clean Water Act of 1972 passed because industrial pollution had become undeniable and the political coalition demanding accountability had grown large enough to overcome industry opposition. It didn't happen spontaneously — it happened because communities organized, documented harm, and connected individual local cases into a national picture. That is precisely the moment we are in now.
Google built its first data center in The Dalles in 2006. By 2024 it consumed a third of all city water — usage nearly tripling in five years. The Oregonian spent 13 months in court, with Google paying the city's legal bills against the newspaper's public records request. After settling in late 2022, Google declared it would no longer seek trade-secret protections for site-level water data nationwide. That commitment came after a 13-month legal fight — not as a voluntary first move. The lesson: transparency doesn't happen on its own. It happens when communities are willing to fight for it, with sufficient persistence.
The EU's Energy Efficiency Directive requires data centers above 1 MW to report water consumption publicly from October 2025. Germany's Energy Efficiency Act imposes fines up to €100,000 for non-compliance. Ireland requires 80% renewable energy and grid operator veto authority. Singapore awards data center capacity competitively — only 380 MW total in 2023–2024 — with strict efficiency requirements. The U.S. has no equivalent federal framework. Only 51% of U.S. operators even track water consumption. This is not a technology gap. It is a governance choice.
"Only 51% of U.S. data center operators track water consumption. The gap benefits those who prefer no accountability."
Water is the most acute concern. Land use, noise, and air quality are significant and underreported. Virginia's ~4,000 diesel backup generators — each 600–3,500 kW — pose concentrated air quality risks during regional outages. Loudoun County's carbon emissions rose over 50%, attributed to data centers. 13% of the county's perennial streams are now impaired, partly from stormwater runoff from millions of square feet of impervious surfaces. Data center noise reaches 55–85 dB exterior and up to 96 dB interior. Low-frequency noise travels farther, penetrates buildings, and is associated with sleep disruption, migraines, and cardiovascular impacts.
The most important thing to understand about data center water consumption: the technology to eliminate most of it exists and is deploying — just not universally, and not fast enough. This is not a situation where we're waiting for a breakthrough. We're waiting for policy to catch up with available technology.
Microsoft committed to zero-water cooling for all new builds from August 2024, saving 125+ million liters per facility annually; its WUE improved from 0.49 to 0.30 L/kWh between 2021 and 2024. Meta's newest facilities use closed-loop systems consuming effectively zero operational water. These systems circulate coolant in sealed loops: no evaporation, no water loss. The tradeoff is slightly higher energy use for mechanical cooling. The technology works at scale. It needs to become mandatory for new builds, not a competitive differentiator for companies choosing to lead.
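As a sanity check on those numbers: the WUE values and the 125-million-liter figure are from the text; the implied IT load is derived here, not reported.

```python
# Back-of-envelope check on the figures quoted above. The derived IT
# load is an inference from the text's numbers, not a reported value.

wue_2021 = 0.49  # L/kWh
wue_2024 = 0.30  # L/kWh

improvement = 1 - wue_2024 / wue_2021
print(f"WUE improvement: {improvement:.0%}")  # ~39%

# Annual IT energy at which the WUE improvement alone saves 125M liters:
saved_per_kwh = wue_2021 - wue_2024          # 0.19 L/kWh
it_kwh_needed = 125e6 / saved_per_kwh
print(f"{it_kwh_needed / 1e6:.0f} million kWh/yr")  # ~658
```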
AWS uses recycled water at 24 locations with plans to expand to 120+. Loudoun Water's 20-mile recycled pipeline delivered 736 million gallons in 2024. Google's Douglas County, Georgia facility runs primarily on recycled municipal wastewater. Despite these examples, alternative water sources still contribute less than 5% of typical data center supply. The majority still rely on potable municipal water. That is a policy failure, not a technical barrier.
In hot climates, data center waste heat can drive desalination. Multi-effect distillation operates below 70°C — matching data center output temperatures. Cloud&Heat in Germany partners with a seawater desalination plant driven by data center excess heat: a circular model where data centers co-generate freshwater instead of consuming it. In Gulf states where AI ambitions and water scarcity collide most acutely, this approach has transformative potential. It exists. It's commercial. It requires intentional siting and planning — neither of which the current development process incentivizes.
Immersion cooling, in which servers are submerged in dielectric fluid, achieves 90%+ cooling-energy reduction and eliminates evaporative water loss. The market is growing at 18–26% annually but still accounts for under 5% of total data center cooling. Per-rack installation exceeds $50,000, roughly 3× air-cooled equivalents, and retrofit complexity for operating facilities is high. The economics require either a mandate or a meaningful water-pricing signal to accelerate deployment beyond early adopters.
"This is not a situation where we're waiting for a breakthrough. We're waiting for policy to catch up with available technology."
Marana, Arizona banned municipal water for data center cooling entirely. Tucson requires water conservation plans before approval. Minnesota's HF 16 requires pre-application evaluation for projects using 100M+ gallons annually and public disclosure of withdrawals. The EU Energy Efficiency Directive mandates reporting from October 2025, with waste heat reuse requirements rising through 2028. These are enforceable standards, not aspirations. They demonstrate that mandates work and that communities survive them.
Nordic countries have largely eliminated the water problem by combining free-air cooling with waste heat district heating. Meta's Luleå, Sweden facility uses 100% free-air cooling year-round, no mechanical chillers. Google's Hamina facility provides 80% of local district heating. These facilities consume near-zero operational water and produce community benefit in the form of heat supply. The Nordic model doesn't directly translate to hot U.S. climates — but the principle does: design the facility as community infrastructure, not extraction infrastructure. The hot-climate pathway is closed-loop cooling plus waste-heat desalination. The precedent exists. The will to mandate it does not yet.
The Roanoke Rambler case is the clearest strategic template available. Henri Gendreau, a local journalist, filed an $86 writ of mandamus after the Western Virginia Water Authority redacted water usage numbers from Google contracts. On November 5, 2025, Judge Ciaffone ruled that water use from public sources is public information — not proprietary. Google appealed. Lost. The authority appealed. Lost. After lawyers filed for civil contempt, records were released in February 2026. One person, $86, and persistence produced the first such ruling in Virginia history. Every community that's been told its water contract data is proprietary now has a precedent to cite.
Water is different from electricity. You can build more electricity generation. You cannot manufacture groundwater that took thousands of years to accumulate in an aquifer. When an aquifer is depleted, the communities above it bear consequences no rate adjustment, no tax revenue, and no corporate sustainability report can undo.
The ethical question about data center water use is not whether companies have the legal right to negotiate water contracts. It's whether the process — often through NDAs, shell companies, and code names — is compatible with the public character of the resource being allocated. When a water authority signs an NDA covering the terms of a public water contract, it uses a legal instrument designed to protect trade secrets to shield the allocation of a public resource from the public that owns it. That is not a narrow legal question. It is a question about what public resources are for.
In California, nearly one-third of data centers are located in the top 10% most polluted census tracts. In Memphis, xAI chose a site in a historically Black community already carrying disproportionate pollution burden. The NAACP's suit is not merely about one facility — it's about the documented pattern of siting decisions that consistently place industrial loads in communities with the least political and legal capacity to challenge them. Environmental justice is not a separate issue from data center infrastructure. It is the issue, in specific ZIP codes, right now.
"When an aquifer is depleted, no rate adjustment and no tax revenue can undo it."
Judge Ciaffone's November 2025 ruling should be the national standard: water usage from public sources is public information. Minnesota's mandatory disclosure requirements should be the national floor. The EU's mandatory reporting framework, operational since September 2024, is the model. None of these require companies to stop using water. They require honesty about how much public water is used. That is a minimal ethical bar — and most of the U.S. industry is currently below it.
"Water positive" pledges typically rely on offset models: watershed restoration projects geographically and temporally disconnected from actual consumption. A restoration project in New Mexico doesn't help a community near an Iowa data center consuming its reservoir. Meta's own SEC filings reveal that despite claiming 100% renewable energy matching, over 50% of electricity at U.S. facilities comes from fossil fuels — the gap filled by purchasing certificates. The same accounting logic applies to water. Purchasing credits for water restored elsewhere does not replace water consumed here. This is not a minor accounting detail. It is the mechanism by which "water positive" pledges avoid local accountability.
This section applies the AI Thinking Model™ — a framework for critical thinking, wisdom, innovation, strategy, and ethics developed by Liz B. Baker, Global Institute for AI & Humanity.