Data Center Resistance: Stopping the Corporate AI Offensive

Major tech companies – OpenAI (GPT), Google (Gemini), Anthropic (Claude), Meta (Llama), and xAI (Grok) – are spending heavily to boost the computing power behind their respective large language model artificial intelligence (AI) systems. The companies claim that this spending will transform those systems into products that businesses will happily pay to use and, in the near future, lead to artificial-general-intelligence-powered machines capable of autonomously solving problems and making decisions far better than humans can.

This effort has produced a data-center building boom, dominated by the hyperscale data centers needed to train and run such advanced systems. In fact, annual spending on data-center construction (a figure that does not include the cost of the servers or land) now equals yearly spending on office-building construction and is expected to overtake it next year. These hyperscale data centers are enormous. For example, OpenAI’s Stargate data-center complex in Abilene, Texas, is large enough to be seen from space.

Hyperscale data centers are a social and ecological disaster, and communities across the country are now organizing to stop their construction and operation. The costs would be too high even if the large language models they are designed to support were socially beneficial. But that is not the case. These models are unreliable, socially dangerous, generally undermine rather than enhance worker capacities, rely on exploited labour for their training and operation, and are a technological dead-end. Moreover, as the New York Times reports, research by McKinsey & Company finds that “nearly eight in 10 companies have reported using generative AI, but just as many have reported ‘no significant bottom-line impact.’”

We need an all-hands-on-deck effort to stop the high-tech assault on our lives, and that includes publicizing the costs of these hyperscale data centers and supporting the community resistance movement.

Sharpening our AI Focus

There is no agreed-upon definition of AI, which allows companies to slap the label on all sorts of products and schemes. That said, artificial intelligence generally refers to technologies that can mimic cognitive functions commonly associated with human intelligence. These technologies are typically divided into two main subgroups: machine learning models and generative AI models.

Machine learning models use algorithms to identify patterns, make decisions, and improve their performance through experience. They do not generate new content and are widely used for tasks such as facial and image recognition, email and phone-call spam filtering, traffic-condition prediction and route optimization, and language translation.

Generative AI models, in contrast, create new content by drawing on the large data sets they are trained on. Some specialize in non-text content such as images, video, and music. Large language models (LLMs) generate human-like text, and the most advanced versions (sometimes called multimodal) can now also respond to and manipulate audio and image inputs. It is these multimodal LLMs that are driving the AI craze and leading the largest data center operators, or hyperscalers, to build the hyperscale data centers needed to run them.

As the New York Times reported in October 2025:

“Google said it was increasing what it planned to spend on AI data center projects this year by $6-billion, after dropping nearly $64-billion on them over the past nine months.

“Microsoft said it had spent $35-billion in its latest quarter, $5-billion more than what it had told investors to expect just a few months ago. And Meta raised its spending forecast to at least $70-billion by the end of the year, which would be nearly double what it spent last year.

“Amazon also said it would be ‘very aggressive’ in adding more data centers and would spend $125-billion this year on capital expenditures – and even more next year.

“Google, Microsoft and Amazon, which are the three largest providers of cloud computing in the United States, said they did not have enough computing power to meet customer demand. That’s despite those three and Meta shelling out a combined $112-billion in just the last three months on capital expenditures, which included construction of data centers.”

Overall, says Bloomberg News, the leading “so-called hyperscale companies are projected to spend $371-billion on data centers and computing resources for AI in 2025, a 44-percent increase from the year prior.” McKinsey & Company predicts that this total will need to reach $5.2-trillion by 2030 to keep up with the demand for AI services.

Bigger Than You Can Imagine and More Costly

The minimum size of a US hyperscale data center is generally put at 100,000 square feet, but many of the largest and newest cover millions of square feet. Each center includes numerous buildings, the most numerous and largest being the data halls that contain the servers needed to run the AI systems. Individual data halls are often as large as or larger than a Walmart Supercenter.

These hyperscale data centers are normally established in relatively small towns and cities where land is cheap, energy costs are low, and governments are willing to offer attractive tax incentives. It has taken time and hard experience, but growing numbers of working people have come to recognize that they have little to gain from having these centers in their communities. In fact, quite the opposite is true. Thus, they have begun organizing to block the construction of new centers as well as the expansion and operation of existing ones. And for solid reasons. One of the most important is the destructive effect of data-center power consumption on energy affordability and availability.

Steam rises above the cooling towers at Google’s data center in The Dalles, Oregon, US.

The regional locations with the greatest concentration of data centers, measured by capacity in megawatts, are (in order) Northern Virginia (an area known as Data Center Alley); Hillsboro/Eastern Oregon; Columbus, Ohio; Phoenix, Arizona; and Dallas, Texas. Data centers in Virginia currently consume 39 percent of the state’s electricity. In Oregon, it is 33 percent. The numbers, while lower elsewhere, have been rapidly climbing – in Ohio, the current share is 9 percent; in Arizona, it is 11 percent. And of course, building continues; planned data-center construction or expansion is on pace to increase capacity by a third in Virginia and by more than half in Columbus, Phoenix, and Dallas.

Regardless of future activity, data-center energy use is already driving up energy prices for working people. Operators of state and regional energy grids have responded to the rise in wholesale energy prices caused by growing data-center demand by passing the higher costs on to households, along with additional charges to cover new maintenance and expansion expenses.

As Bloomberg News describes it:

“PJM Interconnection, the operator of the largest US electric grid, has faced significant strain from the AI boom. The rapid development of data centers relying on the system raised costs for consumers from Illinois to Washington, DC, by more than $9.3-billion for the 12 months starting in June, according to the grid’s independent market monitor. Costs will go up even more next year [2026].

“Baltimore residents saw their average bill jump by more than $17 a month after a power auction held by PJM reached a record high, according to Exelon Corp.’s Baltimore Gas & Electric utility. This year’s auction set another record, which will boost the average power bill in Baltimore again by up to $4 starting in mid-2026.”

The Baltimore rate increase seems to be par for the course, with the Washington Post reporting that the rising cost of powering AI data centers has pushed up home electricity bills by $10 to $27 a month across the Eastern US. And these cost pressures are set to grow. Data centers currently consume some 5 percent of America’s electricity, and predictions are that this will climb to nearly 10 percent by 2030.

But price increases are not the only cause for concern. Many analysts believe that the power industry will be unable to ramp up fast enough to satisfy the growing energy demand expected from the data center buildout. After all, the US energy system is already struggling to update its aging infrastructure. This raises the very real possibility of brownouts in some markets within the next few years.

The power industry effort to boost energy production also has longer-term climate consequences. While some data centers are open to using renewable energy, fossil fuels remain the most desired and common choice for electricity generation. In fact, a number of operators are working, with government encouragement, to reopen or extend the life of many of the country’s existing coal-fired power plants. It is impossible to overstate the danger this trend poses to the fight against global warming.

A second, closely related reason for popular opposition to the establishment of hyperscale data centers is their massive use of water, which threatens community access by depleting aquifers and driving up residential water costs. LLMs require significant computational power, which generates a lot of heat. And the more sophisticated the models, the greater the heat generation. To prevent overheating, hyperscale data centers rely on extensive cooling systems, typically water-intensive ones. A large data center can consume up to 5 million gallons of drinking water per day, roughly equal to the daily water use of a town of 50,000 people.
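That town-of-50,000 comparison is a matter of simple arithmetic, assuming the commonly cited US figure of roughly 100 gallons of residential water use per person per day (an assumption, but a standard one):

$$\frac{5{,}000{,}000 \ \text{gallons/day}}{100 \ \text{gallons/person/day}} = 50{,}000 \ \text{people}$$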

Data centers use potable or drinking water to protect their cooling systems from the corrosion-causing impurities and salts commonly found in non-potable sources. To make matters worse, data centers often treat the water they use with chemicals to prevent bacterial growth, making it unsuitable for human consumption or agricultural use. This means, as researchers from the University of Tulsa explain, “that not only are data centers consuming large quantities of drinking water, but they are also effectively removing it from the local water cycle.”

Data center location decisions are rarely made on the basis of water availability. As noted above, the most important considerations are land prices, energy costs, and taxes. This means that many data centers are built in areas that are already water-stressed. In fact, according to a Bloomberg study,

“about two-thirds of new data centers built or in development since 2022 are in places already gripped by high levels of water stress. While these facilities are popping up all over the country, five states alone account for 72% of the new centers in high-stress areas [California with 17 data centers, Arizona with 26, Texas also with 26, Illinois with 23, and Virginia with 67].

“Tech giants are racing to expand with new – and larger – data centers to support AI, consuming more resources, including water. That only adds to concerns that communities facing water shortages will have to compete with data center operators to access clean water.”

Adding insult to injury, since water rates are often structured to provide lower costs to higher-volume commercial and industrial users, data centers frequently enjoy the most attractive rates. One example, provided by the University of Tulsa researchers, offers some sense of the disparities between data center and residential rates:

“Companies are often able to negotiate better rates for water than local residents. In recent years, Google faced criticism for its plans to build a massive data center in Mesa, Arizona, after it was revealed that the company would pay a lower water rate than most residents. The deal, negotiated with the city, allowed Google to pay $6.08 per 1,000 gallons of water, while residents paid $10.80 per 1,000 gallons.”
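The arithmetic behind that disparity is straightforward: the ratio of the two rates is

$$\frac{\$10.80}{\$6.08} \approx 1.78,$$

so Mesa residents paid roughly 78 percent more per 1,000 gallons than Google did – or, put the other way, Google paid about 44 percent less.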

The full list of reasons people oppose the operation and expansion of data centers is a long one. Beyond the negative consequences of data-center power and water consumption, people worry about the health effects of the noise pollution from cooling fans and diesel backup generators and the air pollution from diesel exhaust. They are also concerned about the visual and physical impact of these data centers on local property values and the livability of their neighborhoods. The list goes on. And so, there is resistance.

A Growing Resistance

Communities often have no idea that data centers are heading their way. That is because the companies seeking to purchase the land are not the ones that will own and operate the centers. Moreover, the hyperscalers that will own and operate them routinely force local government officials to sign non-disclosure agreements, thereby denying residents critical information about the aims of the purchase and the negotiated terms of any agreement.

This is the situation that residents of Bessemer, Alabama, a city of 25,000, found themselves in when, early in 2025, they learned about plans for the construction of a 4.5-million-square-foot data center. As Inside Climate News describes the situation:

“The proposed data center campus in Bessemer, if realized, would consist of 18 buildings, each larger than the average Walmart Supercenter, that would house massive server farms for data storage and processing. Located on about 700 acres of wooded land currently zoned for agricultural use, the proposed physical infrastructure would require the permanent clear-cutting of at least 100 acres of forest.

“The purchasing agent, a limited liability company, was acting as agent for the Corporation Trust Company, which is often used by big tech companies like Google to make land purchases for them. And following the script, Bessemer’s mayor, his chief of staff, and the city attorney all signed non-disclosure agreements with the developer.”

Residents soon filed suit against the city on the grounds that they were not given adequate notice of a public hearing at which the City Council was to take steps to move the plan forward. A county circuit judge agreed with them and issued a temporary restraining order blocking the City Council from rezoning the targeted property from agricultural to industrial use.

Crowds of people have been attending City Hall hearings, demanding information and arguing against the land sale. And since no one involved in the negotiations will publicly reveal the name of the hyperscaler standing behind the curtain – Google, Microsoft, Amazon, and Facebook have all been mentioned as possibilities – it has been impossible to get information about likely power- and water-consumption consequences. The battle continues.

The story is being repeated in Wilsonville, an even smaller Alabama city some 25 miles southeast of Bessemer. But having seen what happened there, Wilsonville residents moved quickly when they learned that a developer was looking to purchase land in their city for a hyperscale data center. They have packed town hall meetings and forced City Council members to slow consideration of the needed zoning changes and permits.

Community struggles against data centers are ongoing in cities and towns across the US. According to Data Center Watch, “There are at least 142 activist groups across 24 states organizing to block data center construction and expansion.” Some examples of successful action:

  • In Cascade Locks, Oregon, and Warrenton, Virginia, voters recalled or voted out of office officials who supported data center developments.
  • In Saint Charles, Missouri, thousands of people led a movement against “Project Cumulus,” a 440-acre data center; the city became the first in the nation to enact a year-long ban on data center construction. Similar bans have been approved or proposed in dozens of counties and townships, including St. Louis; Oldham County, Kentucky; and Jerome Township, Ohio.
  • Loudoun County, Virginia, has taken steps to increase regulatory scrutiny of new data center projects and provide more opportunities for public input into the decision-making process.
  • Ohio and Georgia passed laws that would put the burden on data centers, not consumers, to pay the cost of any new investments to expand the power grid.

It is important to add that while there are more data centers in the US than in any other country, some 60 percent of all data centers are located outside the US, many of them operated by US hyperscalers. And just as in the US, and for similar reasons, movements have emerged in most of those countries opposing their operation and planned construction. The New York Times describes the situation as follows:

“In country after country, activists, residents and environmental organizations have banded together to oppose data centers. Some have tried blocking the projects, while others have pushed for more oversight and transparency…

“‘Data centers are where environmental and social issues meet,’ said Rosi Leonard, an environmentalist with Friends of the Earth Ireland. ‘You have this narrative that data centers are needed and will make us rich and thriving, but this is a real crisis.’”

No doubt working people everywhere would benefit from a greater sharing of experiences and community-based strategies of resistance. At the same time, it seems clear that there is a natural alliance waiting to be formed, one that brings together those opposing data-center operations and construction with those opposing the widespread use of LLMs in schools, workplaces, healthcare institutions, and government services. Such an alliance could serve as a solid foundation for building the kind of movement we need in order to assert popular control over the ongoing development and use of technology. •

This article was first published on the Reports from the Economic Front website.

Martin Hart-Landsberg is Professor Emeritus of Economics at Lewis and Clark College, Portland, Oregon. His writings on globalization and the political economy of East Asia have been translated into Hindi, Japanese, Korean, Mandarin, Spanish, Turkish, and Norwegian. He is the chair of Portland Rising, a committee of Portland Jobs with Justice, and the chair of the Oregon chapter of the National Writers Union. He maintains the blog Reports from the Economic Front.