When It Rains, It Pours: AI Data Centers and Water Consumption
1:09 PM, Feb. 14, 2026
As of February 2026, it seems as though nearly every company has announced that it will adopt, or already has adopted, some AI initiative. There are the obvious suspects like Meta and Apple, but even non-tech companies like KFC and Gucci have hopped aboard the AI train. The surge in AI interest from both the public and private sectors requires significant investment in computing infrastructure, specifically the construction of AI data centers. Indeed, in 2025 the White House announced a 500-billion-dollar initiative with OpenAI and its partners to build some of these data centers.
Unlike traditional data centers, AI data centers are specialized for AI and machine-learning workloads. They process more data and perform more computationally intensive tasks than traditional data centers are designed for, and they correspondingly require significantly more power and cooling to operate.

As anyone who has left a laptop running on their lap for too long knows, computers get hot, especially when crammed together in a data center. Really hot. And the GPUs and other chips behind AI process far more data than any individual laptop. To prevent overheating, AI data centers require a consistent means of cooling essential components. The primary solution, so far, involves consuming large amounts of water to cool the chips. The water can be deployed in several ways, such as evaporative cooling or liquid cooling, but in each case it serves as a coolant that absorbs heat and lowers the temperature of the GPUs.
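A rough physics sketch illustrates why evaporative cooling consumes so much water. Assuming, for simplicity, that all of a server's waste heat is removed by evaporating water (real cooling towers also shed some heat without evaporation), the water lost per kilowatt-hour follows from water's latent heat of vaporization, roughly 2.45 MJ/kg at ambient temperature. The data-center size below is a hypothetical illustration, not a figure from any particular facility.

```python
# Back-of-envelope estimate of water evaporated per kWh of server heat
# rejected through evaporative cooling. Simplifying assumption: all heat
# leaves via evaporation (real towers also reject some heat sensibly).
LATENT_HEAT_MJ_PER_KG = 2.45   # latent heat of vaporization of water, ~25 C
MJ_PER_KWH = 3.6               # 1 kWh expressed in megajoules

def water_evaporated_liters(heat_kwh: float) -> float:
    """Liters of water evaporated to reject `heat_kwh` of waste heat."""
    kg = heat_kwh * MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG
    return kg  # 1 kg of water is roughly 1 liter

# Hypothetical 100 MW facility running for one hour (100,000 kWh of heat):
print(water_evaporated_liters(100_000))  # on the order of 150,000 liters
```

Under these assumptions, every kilowatt-hour of computing heat evaporates close to a liter and a half of water, which is why facility-scale consumption climbs so quickly.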
The cooling process is water-intensive and electrically demanding. A single AI data center can guzzle more water than a town. AI use at the individual level has far-reaching repercussions as well. Asking ChatGPT to generate a 100-word response consumes over two cups of water and enough electricity (0.14 kWh) to power 14 LED light bulbs for an hour. ChatGPT alone devours enough power daily to light up the Empire State Building for a year. If AI growth continues at its current pace, it is estimated that by 2030 AI-related projects could release between 24 and 44 million metric tons of carbon dioxide and drain around 1,000 million cubic meters of water per year, roughly the annual water usage of 6 to 10 million American homes. Moreover, these centers are largely being built in water-scarce regions, like the Mountain West in the United States, where land is usually cheaper. Water piped to these data centers to cool their servers evaporates in the process, consuming what is usually potable drinking water and depriving locals of an already limited supply. People living next to data centers are forced to compete for water with centers backed by some of the world's largest companies, and they are decisively losing.
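The per-response figures above scale up quickly. The sketch below takes the article's cited constants (about two cups, roughly half a liter, of water and 0.14 kWh per 100-word response) and multiplies them out; the query volume used is a hypothetical illustration, not a measured number.

```python
# Scales the per-response figures cited in the text (roughly 2 cups of
# water and 0.14 kWh per 100-word response) to larger query volumes.
# The per-query constants come from the article; the daily query count
# below is a hypothetical illustration, not a reported statistic.
WATER_L_PER_RESPONSE = 0.5      # ~2 US cups, per the article
ENERGY_KWH_PER_RESPONSE = 0.14  # per the article

def daily_footprint(queries_per_day: int) -> tuple[float, float]:
    """Return (liters of water, kWh of electricity) for one day of queries."""
    return (queries_per_day * WATER_L_PER_RESPONSE,
            queries_per_day * ENERGY_KWH_PER_RESPONSE)

water_l, energy_kwh = daily_footprint(10_000_000)  # hypothetical 10M queries/day
print(water_l, energy_kwh)  # roughly 5 million liters and 1.4 million kWh
```

At that hypothetical volume, a single day of short responses would consume millions of liters of water, which is the individual-to-aggregate leap the paragraph above describes.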
Despite the pressures that AI data centers place on the environment, a moratorium on further construction is unlikely to succeed, especially as AI becomes more integrated into our economy. Instead, it is important to identify sustainable practices that companies can adopt to decrease their environmental impact. A first step would be to build AI data centers in less water-stressed areas and to improve the efficiency of water cooling. Companies like AWS are exploring projects such as rainwater harvesting and the use of treated water at their centers, aiming for water neutrality by replenishing or saving as much water as they consume. Some European countries have explored harnessing the waste heat produced by data centers to warm homes. Lastly, adopting more renewable energy sources, and thereby reducing fossil fuel consumption, would free up billions of gallons of water, helping to ease the growing strain AI places on our environment. To that end, Meta recently announced several nuclear energy projects to meet the energy needs of its upcoming data center supercluster as an alternative to traditional fossil fuels.
To address the growing impact of AI data centers, the government needs a more proactive and involved approach to AI regulation. The lack of transparency in the industry contributes to unchecked environmental harm: communities are left unaware of what is happening in their backyard and, thus, are ill-equipped to push back. Currently, data center operators can withhold information about their water and energy consumption, exacerbating the problems already associated with their massive resource use. In New Jersey, representatives have introduced legislation enabling the EPA to collect data directly from these centers on their environmental impacts and energy consumption. At the federal level, Senator Markey introduced a bill that would likewise allow the EPA to investigate the environmental impact of artificial intelligence. Significantly, laws like these open the door for future legislation addressing sustainability in AI data centers. Although these initial steps are helpful, the fight over AI data centers is still in its infancy and demands far more attention. The legal battles waged over the next decade will have environmental repercussions felt for generations to come.
Reed Seckinger
Reed received his undergraduate degree from Elon University, where he majored in political science and minored in finance. As a 2L at UNC Law, Reed is a JOLT staff member, Dean’s Fellow, and serves on the board of the Media Law Society.