Google's latest annual environmental report reveals the true impact its recent forays into artificial intelligence have had on its greenhouse gas emissions.
Expanding its data centres to support AI development contributed to the company emitting 14.3 million tonnes of carbon dioxide equivalent in 2023, a 48% increase on 2019 and a 13% increase on 2022.
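For a sense of scale, the quick arithmetic below (a rough sketch, not figures quoted from the report) shows the earlier baselines those percentages imply:

```python
# Back-of-the-envelope check of the baselines implied by the percentages above.
# The 14.3 Mt figure comes from the report; the implied baselines are estimates.
emissions_2023_mt = 14.3  # million tonnes CO2e reported for 2023

implied_2019_mt = emissions_2023_mt / 1.48  # 48% higher than 2019
implied_2022_mt = emissions_2023_mt / 1.13  # 13% higher than 2022

print(f"Implied 2019 baseline: {implied_2019_mt:.1f} Mt CO2e")  # roughly 9.7
print(f"Implied 2022 baseline: {implied_2022_mt:.1f} Mt CO2e")  # roughly 12.7
```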
“This result was primarily due to increased data center energy consumption and supply chain emissions,” the report’s authors wrote.
“As we further integrate AI into our products, reducing emissions may prove challenging due to increasing energy demands resulting from the increased processing intensity of AI and emissions associated with expected increases in our investment in technical infrastructure.”
WATCH: How Microsoft, Google Cloud, IBM and Dell are working to reduce climate harms from AI
Google says it cannot isolate how much of its overall data center emissions AI is responsible for
In 2021, Google committed to achieving net-zero emissions across its operations and value chain by 2030. The report states that this goal is now considered “extremely ambitious” and “will require [Google] to navigate significant uncertainty.”
The report goes on to say that the environmental impact of AI is “complex and difficult to predict,” so the company only publishes metrics for its data center operations as a whole, which include cloud storage and other workloads. As a result, the emissions attributable specifically to AI training and use in 2023 remain undisclosed for now.
That said, in 2022, Google engineer David Patterson wrote in a blog post: “Our data shows that ML training and inference account for only 10% to 15% of Google’s total energy use over each of the past three years.” However, this proportion has likely increased since then.
SEE: Everything you need to know about Greentech
Why AI is responsible for rising emissions from tech companies
Like most of its competitors, Google has launched a number of AI products and features over the past year, including Gemini, Gemma, AI Overviews and image generation in Search, and AI-powered security tools.
Artificial intelligence systems, and the training of large language models in particular, require massive computational power, resulting in higher electricity consumption and, consequently, higher carbon emissions than ordinary online activity.
SEE: Cheat Sheet on Artificial Intelligence
According to a study by Google and the University of California, Berkeley, training OpenAI’s GPT-3 generated 552 metric tons of carbon dioxide equivalent, roughly what 112 gasoline-powered cars emit in a year. Furthermore, studies estimate that a generative AI system uses about 33 times more energy than machines running task-specific software.
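As a rough sanity check on that car comparison, the implied per-vehicle figure below lands near the roughly 4.6 tonnes of CO2 per passenger car per year that the US EPA commonly cites; the EPA number is an assumption used for illustration, not a figure from the study:

```python
# Rough sanity check of the GPT-3-to-cars comparison above.
gpt3_training_t = 552   # tonnes CO2e attributed to training GPT-3
cars_cited = 112        # cars cited as the yearly equivalent

print(f"Implied per-car emissions: {gpt3_training_t / cars_cited:.1f} t CO2/year")  # ~4.9

# Assumed EPA estimate of ~4.6 t CO2 per passenger vehicle per year, for illustration only.
epa_per_car_t = 4.6
print(f"Cars at the EPA figure: {gpt3_training_t / epa_per_car_t:.0f}")  # ~120
```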
Last year, Google's total data center electricity consumption grew by 17%, and while we don't know how much of this was due to AI-related activities, the company admitted that it “expects this trend to continue in the future.”
Google is not the first major tech organisation to reveal that advances in artificial intelligence are driving up its emissions and proving difficult to manage. In May, Microsoft announced that its emissions had increased by 29% compared to 2020, largely as a result of building new data centres. “Our challenges are in part unique to our position as a leading cloud services provider that is expanding its data centres,” Microsoft’s environmental sustainability report states.
Leaked documents seen by Business Insider in April reportedly show that Microsoft has secured more than 500 MW of additional data center space since July 2023 and that its GPU footprint now supports live “AI clusters” in 98 locations globally.
Four years ago, Microsoft President Brad Smith called the company’s commitment to becoming carbon negative by 2030 a “moonshot.” However, speaking on Bloomberg’s Zero podcast in May, he admitted that “the moon has moved” since then and is now “more than five times farther away.”
Alex de Vries, founder of the digital trends research platform Digiconomist, which tracks AI sustainability, believes that Google and Microsoft’s environmental reporting shows tech executives aren’t taking sustainability as seriously as AI development. “On paper they may say so, but the reality is that they are currently clearly prioritizing growth over meeting those climate goals,” he told TechRepublic in an email.
“Google is already struggling to meet its growing energy demand from renewable energy sources. The carbon intensity of each MWh Google consumes is rapidly increasing. Globally, we only have a limited supply of renewable energy sources available and the current trajectory of AI-related electricity demand is already excessive. Something will have to change dramatically for those climate goals to be achievable.”
Google's skyrocketing emissions could also have a knock-on effect on companies that use its AI products and have their own environmental goals and regulations to meet. “If Google is part of their value chain, Google's increased emissions also mean that their Scope 3 emissions will increase,” de Vries told TechRepublic.
How Google manages its AI emissions
Google’s environmental report highlights a number of ways the company is managing the energy demands of its AI developments. Its latest Tensor Processing Unit, Trillium, is 67% more energy efficient than the fifth-generation TPU, while its data centers are, on average, 1.8 times as energy efficient as a typical enterprise data center.
Compared to five years ago, Google's data centers now also deliver roughly four times as much computing power from the same amount of electrical power.
In March 2024 at NVIDIA GTC, TechRepublic spoke to Mark Lohmeyer, vice president and general manager of Compute Infrastructure and AI/ML at Google Cloud, about how its TPUs are becoming more efficient.
He said: “If you think about running a highly efficient form of accelerated computing with our own internal TPUs, we leverage liquid cooling for those TPUs which allows them to run faster, but also in a much more energy-efficient way and as a result, in a more cost-effective way.”
Google Cloud also uses software to manage uptime sustainably. “What we don’t want is to have a bunch of GPUs or any kind of compute deployed that are using power but not actively producing the results we’re looking for,” Lohmeyer told TechRepublic. “So driving high levels of infrastructure utilization is also key to sustainability and energy efficiency.”
Google’s 2024 Environmental Report says the company is managing the environmental impact of AI in three ways:
- Model optimization: For example, Google improved the training efficiency of its fifth-generation TPU by 39% using techniques that speed up training, such as quantization, which lowers the numerical precision used to represent model parameters and thereby reduces the computational load (a simplified sketch follows this list).
- Efficient infrastructure: Its fourth-generation TPU was 2.7 times more energy efficient than its third-generation predecessor. In 2023, Google’s water stewardship program replenished 18% of the water the company consumed, much of which goes to cooling data centers.
- Emissions reduction: Last year, 64% of the energy consumed by Google's data centers came from carbon-free sources such as renewables. The company also deployed carbon-intelligent computing platforms and demand response capabilities in its data centers.
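For readers unfamiliar with quantization, mentioned in the first point above, the sketch below shows the general idea using NumPy: weights stored as 32-bit floats are mapped to 8-bit integers plus a scale factor, shrinking memory and compute cost at the price of a small rounding error. It is a generic illustration with made-up values, not Google's training setup.

```python
import numpy as np

# Generic illustration of quantization, not Google's pipeline: map 32-bit
# float weights to 8-bit integers plus a single scale factor.
weights_fp32 = np.random.randn(4, 4).astype(np.float32)   # made-up parameters

scale = np.abs(weights_fp32).max() / 127.0                 # fit the value range into int8
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)

# Dequantize to approximate the originals when full-precision values are needed.
weights_approx = weights_int8.astype(np.float32) * scale

print("Bytes before:", weights_fp32.nbytes)                # 64
print("Bytes after: ", weights_int8.nbytes)                # 16
print("Max rounding error:", float(np.abs(weights_fp32 - weights_approx).max()))
```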
Additionally, Google’s AI products are being designed to address climate change more broadly, such as fuel-efficient routing in Google Maps, flood prediction models, and the Green Light tool that helps engineers optimize traffic light timing to reduce stop-and-go traffic and fuel consumption.
AI demand could outstrip emissions targets
Google says that the electricity consumption of its data centers, which power its artificial intelligence activities among other things, currently accounts for only about 0.1% of global electricity demand. More broadly, the International Energy Agency estimates that data centers and data transmission networks are responsible for 1% of energy-related emissions.
However, this figure is expected to rise significantly in the coming years, with data center electricity consumption projected to double between 2022 and 2026. SemiAnalysis forecasts that data centers will account for around 4.5% of global electricity demand by 2030.
Considerable amounts of energy are needed to train and run AI models in data centres, but manufacturing and transporting chips and other equipment also contribute. The IEA has estimated that AI in particular will use ten times more electricity in 2026 than it did in 2023 as demand continues to rise.
WATCH: AI causes fundamental power and cooling problem in Australian data centers
Data centres also require huge amounts of water for cooling, and even more so when performing energy-intensive AI calculations. A study by the University of California, Riverside, found that the amount of water withdrawn for AI workloads could reach the equivalent of half of the UK’s annual consumption by 2027.
Increased demand for electricity could push tech companies to turn to non-renewable energy
Tech companies have long been big investors in renewable energy – Google's latest environmental report states it purchased more than 25 TWh of renewable energy in 2023 alone. However, there are concerns that rising energy demand driven by AI will keep coal- and oil-fired plants running that would otherwise have been decommissioned.
For example, in December, county supervisors in Northern Virginia approved the construction of up to 37 data centers across 2,000 acres, prompting proposals to expand the use of coal-fired power.