The Hidden Cost of AI

Published May 17, 2026, 12:49 AM


By: Brandon Foley

We’ve all heard about the impending AI push. One of its most visible forms is generative AI, which lets you type in a prompt and get back an answer or an image. But how many people have stopped to ask what it demands in return? The rising trend in its power requirements is worth examining.

Data centers, the facilities that store and process this information, need resources to function. They are owned and operated by tech giants like Amazon, Google, Palantir, OpenAI, and Meta, and have popped up in Silicon Valley, the New York metropolitan area, and Dallas, among other places.

One such resource is land to build upon. According to the Regional Plan Association, these processing facilities average around 100,000 square feet, the size of 40 decent-sized homes, with the average in its region being 143,000 square feet. Hyperscale data centers are far larger facilities meant to handle massive workloads. These can exceed one million square feet; Google’s first hyperscale data center was 1.3 million square feet, larger than 22 football fields.

AI data center owners have downplayed the power their facilities use. OpenAI CEO Sam Altman compared it to the cost of raising a human: “One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model ... but it also takes a lot of energy to train a human.” He added, “It takes like 20 years of life, and all the food you eat before that time, before you get smart.”

Even so, the water usage and sheer scale of these facilities present real issues. According to the Lincoln Institute of Land Policy, a non-partisan, private think tank, mid-size data centers can use hundreds of thousands of gallons of water a day, while the largest need as much as 5 million gallons daily, comparable to a town of 50,000 people. “A study by the Houston Advanced Research Center (HARC) and the University of Houston found that data centers in Texas used 49 billion gallons of water in 2025, and could use as much as 399 billion gallons in 2030. That would be equivalent to drawing down the largest reservoir in the US—157,000-acre Lake Mead—by more than 16 feet in a year.”

The centers also require large amounts of energy, slowing the transition to cleaner power and driving up electricity costs. According to the Regional Plan Association, “By 2028, data centers could consume up to 12% of total electricity in the United States, up from 4.4% in 2023, increasing from 176 TWh up to potentially 580 TWh - the equivalent of adding eight New York Cities to the country.” Each ChatGPT query uses an estimated 0.3 to 3 watt-hours of electricity; for perspective, a typical household uses about 10 megawatt-hours a year. Per the International Energy Agency, a mid-size data center uses as much electricity as 10,000 to 25,000 residents, and hyperscale data centers can use as much as 100,000 homes or more.
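To see how these figures relate, here is a rough back-of-envelope check of the scale comparisons above. The 10 MWh-per-household figure comes from the article; the conversion and arithmetic are illustrative, not an official projection.

```python
# Back-of-envelope check of the data center electricity figures (illustrative only).
TWH_TO_MWH = 1_000_000          # 1 terawatt-hour = 1,000,000 megawatt-hours

projected_2028_twh = 580        # projected U.S. data center demand (Regional Plan Association)
baseline_2023_twh = 176         # 2023 baseline from the same projection
household_mwh_per_year = 10     # typical household usage, per the article

# New demand added between 2023 and 2028
growth_twh = projected_2028_twh - baseline_2023_twh

# How many households 580 TWh could power at 10 MWh per household per year
homes_equivalent = projected_2028_twh * TWH_TO_MWH // household_mwh_per_year

print(f"Projected growth: {growth_twh} TWh")            # 404 TWh of new demand
print(f"Household equivalent: {homes_equivalent:,}")    # 58,000,000 homes
```

At 10 MWh per home, 580 TWh works out to roughly 58 million households' worth of electricity, which helps convey why the projection is compared to adding eight New York Cities.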

Meta’s Hyperion data center in Louisiana, for example, is expected to draw more than twice the power of the entire city of New Orleans once completed, and, as the Associated Press reported, another Meta data center planned in Wyoming will use more electricity than every home in the state combined. While some argue that water recycling could help, its gains can be modest: xAI’s water recycling facility in Memphis, Tennessee, is expected to reduce water use by only about 9% under its licensing agreement.

Activists have protested data center development in cities and states over these very concerns. Food and Water Watch, Greenpeace, and Friends of the Earth, among many others, have signed on to a letter asking members of Congress to support a moratorium on approving and constructing new data centers.

Sources: