environmental-impact
Google sued a city to keep its water usage a secret. Its AI was drinking 30% of the local supply.
AI training requires millions of liters of freshwater, yet Big Tech fights to keep usage data secret. Is your ChatGPT query draining local aquifers?
For years, Silicon Valley has successfully marketed the "Cloud" as a weightless, ethereal dimension where data exists without friction. We are told that Artificial Intelligence is a product of pure math and silicon, a digital genie that requires nothing more than a few billion parameters and a visionary CEO. But the physical reality is significantly grubbier and far wetter. Behind every LLM is a massive, heat-generating server farm that must be cooled to prevent it from melting into a puddle of expensive slag. To do this, tech giants are increasingly turning to evaporative cooling—a process that effectively turns local freshwater into steam to keep the GPUs happy.
Despite corporate "water positive" pledges, the rapid expansion of AI infrastructure is creating immediate, localized water crises that are systematically obscured from the public through "trade secret" legal claims. As companies like Google and Microsoft race to achieve AGI, they are leaving behind a "blue footprint" that is often documented only after local governments are dragged through the courts. The analytical framework here is simple: we must weigh the physical resource consumption of these clusters against the transparency of the entities operating them. Currently, the receipts do not look good for the neighbors of the server farm.
What happened: The Thirsty Training of GPT-3
The scale of water consumption in AI training is difficult to visualize until you look at the raw data. Training a single large model like GPT-3 in Microsoft’s state-of-the-art U.S. data centers is estimated to have directly evaporated 700,000 liters of clean freshwater, according to research by Li et al. at UC Riverside. This figure represents Water Consumption, which is the portion of water withdrawn that is evaporated and not returned to the local source. It is the literal disappearance of water from the local utility's ledger.
A standard conversation consisting of approximately 20 to 50 questions and answers with ChatGPT "consumes" a 500ml bottle of water through data center cooling, according to the Making AI Less Thirsty report.
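The per-query figure implied by these numbers is easy to derive. A minimal sketch, using only the constants cited above; the 500ml bottle and the 20-to-50-query range come from the report, while the per-query division and the scale-up to the GPT-3 training figure are my arithmetic, not reported data:

```python
# Back-of-envelope check of the cited water figures.
# Constants are the article's; derived values are illustrative only.

BOTTLE_L = 0.5                        # 500 ml bottle per conversation
QUERIES_LOW, QUERIES_HIGH = 20, 50    # queries per conversation

per_query_high = BOTTLE_L / QUERIES_LOW    # 0.025 L (25 ml) per query
per_query_low = BOTTLE_L / QUERIES_HIGH    # 0.010 L (10 ml) per query

# Scale up: how many such conversations equal GPT-3's estimated
# 700,000 L of training-time evaporation?
TRAINING_L = 700_000
conversations = TRAINING_L / BOTTLE_L      # 1.4 million conversations

print(f"{per_query_low*1000:.0f}-{per_query_high*1000:.0f} ml per query")
print(f"{conversations:,.0f} conversations = one GPT-3 training run")
```

In other words, roughly 10 to 25 milliliters evaporate per question, and the training run alone equals about 1.4 million full conversations' worth of cooling water.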
This consumption isn't just an abstract environmental cost; it is a direct tax on municipal infrastructure. In July 2022, while residents of West Des Moines, Iowa, were under water-shortage warnings, Microsoft’s data center cluster was busy drinking. Documents obtained by the Associated Press show that Microsoft pumped 11.5 million gallons of water to its cluster during that period—accounting for nearly 6% of the city’s total water supply. The physical reality of server cooling stands in stark contrast to the digital abstractness of the software it hosts. According to the Microsoft 2023 Environmental Sustainability Report, the company's global water use spiked 34% in a single year, reaching nearly 1.7 billion gallons.
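The reported figures also let us back out the scale of the local system involved. A hedged sketch: the 11.5-million-gallon draw, the roughly 6% share, and the 1.7-billion-gallon global figure come from the text above; the implied district total and the liter conversion are derived, not reported:

```python
# What the West Des Moines and Microsoft figures imply (derived, not reported).
GAL_TO_L = 3.785                      # liters per US gallon

ms_pumped_gal = 11_500_000            # Microsoft's July 2022 draw (AP documents)
share = 0.06                          # ~6% of the city's total supply

# Implied total city supply for that period (arithmetic only):
district_total_gal = ms_pumped_gal / share     # ~192 million gallons

annual_global_gal = 1_700_000_000     # Microsoft's reported global annual use
annual_global_l = annual_global_gal * GAL_TO_L # ~6.4 billion liters

print(f"Implied city supply: {district_total_gal/1e6:.0f}M gallons")
print(f"Global annual use:   {annual_global_l/1e9:.1f}B liters")
```

The derivation makes the "6%" concrete: one company's cluster, in one month of a drought year, drew what a city of tens of thousands treats as a meaningful slice of its entire supply.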
Why it matters: The Trade Secret Loophole
The most alarming aspect of this expansion is not just the volume of water used, but the lengths to which tech companies go to hide it. In The Dalles, Oregon, a city situated along the Columbia River, Google's presence became a legal flashpoint. For over a year, the city government sued to block a public records request filed by The Oregonian, fighting to keep Google's water usage a secret. The city, acting on Google's behalf, claimed that the amount of water used to cool the data centers was a "trade secret" that would harm Google's competitive advantage if revealed.
A settlement eventually forced the disclosure of the receipts. It turned out that Google’s data centers were using nearly 30% of the city’s total water supply, as reported by The Oregonian / OregonLive. This is the "Trade Secret" loophole in action: using NDAs and litigation to prevent communities from knowing the ecological cost of their new corporate tenants. While Google’s total water consumption rose by 20% in 2022 to approximately 5.6 billion gallons (according to the Google 2023 Environmental Sustainability Report), the local impact is often far more severe than the global average suggests.
WUE (Water Usage Effectiveness) is the industry-standard metric defined as the ratio of annual site water use to the energy used by IT equipment (L/kWh).
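As a concrete illustration of the metric just defined, here is a minimal WUE calculation. The facility size and water volume below are hypothetical placeholders chosen for illustration, not figures for any real site:

```python
# Illustrative WUE calculation. WUE = annual site water use (L)
# divided by annual IT equipment energy (kWh). All inputs are hypothetical.

def wue(site_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness in liters per kilowatt-hour."""
    return site_water_liters / it_energy_kwh

# Hypothetical 20 MW IT load running year-round:
it_energy_kwh = 20_000 * 24 * 365     # 175.2 million kWh
site_water_l = 320_000_000            # 320 million liters (assumed)

print(f"WUE = {wue(site_water_l, it_energy_kwh):.2f} L/kWh")
```

A site-level number like this is exactly what the "trade secret" claims keep out of public view: without it, a community cannot tell whether its new tenant is an efficient operator or an outlier.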
The power imbalance is documented in the way these deals are struck. Local water utilities, often underfunded and eager for the tax revenue a tech giant brings, frequently sign agreements that include strict confidentiality clauses regarding resource usage. This allows the companies to maintain their public-facing "sustainability" image while privately depleting local aquifers.
The Carbon-Water Trade-off
Defenders of the current cooling infrastructure argue that water-based cooling is significantly more energy-efficient than air-based cooling, thereby reducing the overall carbon footprint of AI. Microsoft and Google have both emphasized that water cooling can be up to 80% more energy-efficient, which reduces the strain on the electrical grid and lowers greenhouse gas emissions. This is often framed as a "net positive" for the planet.
However, this creates what researchers at UC Riverside call a "localized water stress" paradox. While global carbon reduction is a valid and necessary goal, it does not help a community whose drinking water supply has been depleted by 30% to support a data center. Reducing global emissions is a long-term, distributed benefit; losing a third of your municipal water is an immediate, localized catastrophe. The argument that we must sacrifice local water to save global carbon is a convenient framing for companies that find water to be a cheaper cooling medium than electricity.
As Shaolei Ren, lead researcher at UC Riverside, told the Associated Press, "Most people are not aware of the water usage associated with ChatGPT. If you don’t know about the usage, then there is no way we can start conserving the resources."
What's next: The Sustainability Paradox
The demand for AI is not slowing down, and neither is its thirst. Global AI demand is projected to account for 4.2–6.6 billion cubic meters of water withdrawal by 2027—roughly half the annual water withdrawal of the United Kingdom, according to the UC Riverside study. In response, Big Tech has pivoted to the concept of being Water Positive, a corporate pledge to replenish more freshwater than it consumes by 2030.
But "Water Positive" is a slippery term. It often involves funding projects like leak detection in distant cities or restoring wetlands hundreds of miles away from the data centers that are actually consuming the water. While these projects have ecological value, they do not replenish the specific aquifer or river being tapped by a local cluster. It is essentially a system of "water offsets" that functions much like carbon offsets—allowing the physical depletion of a resource in one place to be "balanced" by a checkbook elsewhere.
To address this, there is a growing need for a standardized WUE disclosure. Without mandatory, public reporting of water usage at the site level, the public is forced to rely on aggregate global numbers that mask the severity of local crises.
Conclusion: The Hidden Debt
The evidence from West Des Moines and The Dalles supports the thesis that AI infrastructure expansion is creating localized water crises that are being systematically obscured. The "trade secret" defense used by Google and other operators is a documented legal strategy designed to prevent public scrutiny of resource consumption. While the energy efficiency of water cooling is a factual engineering reality, it is being used as a shield to justify the depletion of public utilities.
Corporate transparency remains the only viable path to conservation. As long as water usage is treated as a proprietary secret rather than a public concern, the environmental cost of AI will remain a debt that local communities are forced to pay without their informed consent. The "Cloud" isn't made of vapor; it’s made of the water we drink, and right now, the AI is drinking faster than the rest of us.