The disproportionate effects of AI data centers on local communities and what can be done about them


The first part of our Keep Calm and Count the Kilowatts series showed how AI prompts are just a small part of a person's daily energy use. The second part explored how AI's energy, water and carbon footprints stack up on a global scale.

But the real environmental impact here isn't the small sip of energy your individual prompt uses; it's the massive, concentrated impact that new data centers can have on the specific cities and ecosystems in which they are built.

Disproportionate effects

AI-specific data centers leave a larger, messier footprint than other types of data centers: they are large industrial facilities that strain power grids, water supplies, and air quality.

There are two main problems. First, power density: running AI means packing high-powered GPU loads into a small area, so an AI data center can draw far more power than a same-sized facility that only streams Netflix.
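To see roughly what that density difference means, here's a back-of-the-envelope sketch. The rack figures below are illustrative assumptions for this example, not measurements from any specific facility:

```python
# Back-of-the-envelope rack power density comparison.
# All figures are illustrative assumptions, not measurements
# from any particular data center.

TRADITIONAL_RACK_KW = 8   # assumed: typical enterprise/streaming rack
GPU_RACK_KW = 80          # assumed: dense AI training rack
RACKS = 1000              # assumed: rack count for a same-sized hall

traditional_mw = TRADITIONAL_RACK_KW * RACKS / 1000
gpu_mw = GPU_RACK_KW * RACKS / 1000

print(f"Traditional hall: {traditional_mw:.0f} MW")
print(f"AI training hall: {gpu_mw:.0f} MW")
print(f"Ratio: {gpu_mw / traditional_mw:.0f}x")
```

Under these assumptions, the same building footprint demands roughly ten times the power, which is exactly the kind of jump local substations were never sized for.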

Operators are already finding that this kind of concentrated demand forces them to rewire substations and delay new construction, because AI data center expansions can't always get enough power to fully operate.

In fact, enterprises are having to reshape power and cooling around AI workloads, and AI is driving higher emissions in data centers.

Second, location: a data center built to train AI models doesn't need to sit close to its users the way a content delivery site does, so it can be built in places other data centers cannot, and those places are often less equipped to handle the impact.

Those impacts include air pollution from portable gas turbines, as happened when an AI data center was built in an area where the grid could supply only 4% of its energy needs. Sites like this can end up importing diesel, burning gas on site, and competing with local residents for already overstretched infrastructure.

And while data centers are not large water users compared to other industries, they are often built in areas where even modest consumption has a significant impact on the limited resources available.

It's easy to blame AI alone for these impacts, but the underlying problem is lax (and many would say corrupt) regulations and laws (not to mention the politicians in charge) that make it cheaper for companies to harm the environment than to invest in sustainability.

In fact, AI (and data centers in general) don't have to use water for cooling and can be carbon neutral: it simply costs more and cuts into profits.

Can data centers go green?

(Image credit: Shutterstock/Tomasz Wozniak)

Right now it's a race to build new GPU farms wherever possible, an expansion projected to triple local energy demand by 2035. Avoiding the negative impacts of this increase is not an unsolved problem, or even a difficult one: it is well-researched engineering.

The keys are better grid planning, so that power-hungry data centers don't overwhelm their local supply; dense but efficient water-cooled racks that waste less energy as heat; and regulations and incentives that make it more profitable for businesses to use renewable energy and rely less on local resources like water.

Although it is not yet enough, this greener approach is already being implemented. Google has what seems like an on-again, off-again relationship with its "don't be evil" mantra, but the company gets about 66% of its electricity from renewable sources and tops that figure up to 100% with offsets. Google is also experimenting with campuses located right next to wind and solar farms.

But right now, these greener approaches are (mostly) not adopted out of the goodness of a company's heart: if the grid can't keep up with AI's growing power demands, future profits could be lost.

And just because companies try something doesn't mean they'll keep doing it: Microsoft canceled its Project Natick underwater data center testing despite it being a success.

The missing piece remains government regulation and incentives. Done correctly, it is entirely possible to balance data center growth with environmental responsibility and avoid harming the local area.

And despite political opposition, renewable energy production continues to rise delightfully fast and is expected to be more than enough to meet new demand (including that for AI) for the rest of the decade.

Data centers can also help the local area: waste heat can be a valuable community asset for heating homes and even greenhouses.

What's next?


(Image credit: Shutterstock/Gorodenkoff)

A sustainable AI future also involves using the technology to reduce emissions faster than it increases them. That could include training energy-intensive models in areas where green energy is abundant and then putting AI to work so it can help amplify and improve existing efforts to reduce environmental impact.

It also means having more conversations about the real impacts of data centers. Right now, AI companies rarely talk in detail about their energy use, even as data centers quietly become a much larger share of global emissions and big players like Google use more energy every year, and not just for AI.

It's not as simple as throwing money at higher-tech solutions, either; balancing cost reduction against climate impact is an important and nuanced consideration in the age of AI data infrastructure.

Still, AI data centers can be built in places and in ways that support the local community, but only when adequate regulations and infrastructure upgrades are also in place.

And yes, AI can have many problematic impacts as a technology, but it is also just the sudden new growth that exposed existing flaws in our energy systems and regulations. Recognizing and discussing those underlying issues means we can better focus on creating a genuinely sustainable AI future.

Takeaway

The central fact is that one AI prompt (or even hundreds) is only a small fraction of most people's daily energy use, and small compared to luxuries like television, games, and even Christmas lights.

On a global scale, AI's energy use is significant enough to pay attention to, but it's still only a minor part of the collective race to see whether the technology will save us from ourselves or simply provide a more entertaining apocalypse.


(Image credit: Shutterstock/aapsky)

Of course, there is no need to just sit back and wait to see how everything develops. If you want to take matters into your own hands, offsetting the CO2 emissions from your AI use is a rounding error on the already surprisingly low cost of becoming carbon neutral.

In fact, offsetting all of my personal carbon emissions for a year starts at about the same cost as a subscription to ChatGPT Plus.

So stay calm, count the kilowatts, and focus on where the big wins are – just remembering to turn off the bathroom light before bed will give you 250 guilt-free AI prompts.
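That bathroom-light figure roughly checks out under simple assumptions. The bulb wattage, hours left on, and energy per prompt below are illustrative estimates I've picked for this sketch, not measured values:

```python
# Rough sanity check of the "250 guilt-free prompts" claim.
# All three inputs are illustrative assumptions.

BULB_WATTS = 10        # assumed: small LED bathroom bulb
HOURS_ON = 8           # assumed: accidentally left on overnight
WH_PER_PROMPT = 0.3    # assumed: rough energy cost of one AI prompt

saved_wh = BULB_WATTS * HOURS_ON       # watt-hours saved by switching off
prompts = saved_wh / WH_PER_PROMPT     # equivalent number of prompts

print(f"Energy saved: {saved_wh} Wh")
print(f"Guilt-free prompts: {prompts:.0f}")
```

With these numbers you save about 80 Wh, which buys on the order of 250 prompts; tweak the assumptions and the exact count moves, but the order of magnitude holds.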

Don't get me wrong: AI is mired in problems and controlled by problematic people and companies, but its electricity use isn't the reason for the pessimism. Mostly.


Not convinced AI can go green? Let me know what you think the better plan is in the comments!



How we use AI

Here at TechRadar, our coverage is author-driven. AI helps with sourcing, research, fact-checking, and spelling and grammar suggestions, but a human still checks every figure, source, and word before something is published. We occasionally use it for important jobs like adding dinosaurs to our colleagues' photos. For a full overview, see our Future and AI page.
