According to original research by The Washington Post and the University of California, Riverside, ChatGPT with GPT-4 uses about 519 milliliters of water — a little more than a 16.9-ounce bottle — to generate a 100-word email. This extravagant use of resources can worsen man-made drought conditions, particularly in already dry climates.
The Washington Post article is based on the research paper “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models,” by Mohammad A. Islam of the University of Texas at Arlington and Pengfei Li, Jianyi Yang, and Shaolei Ren of the University of California, Riverside. Journalists Pranshu Verma and Shelly Tan and their editing team used public data for their water footprint and electricity consumption estimates, as detailed in the article.
How much water and electricity does ChatGPT require?
The Washington Post and the University of California, Riverside, analyzed the electricity needed to run generative AI servers and the water needed to keep them cool. The amount of water and electricity used at specific data centers can vary depending on the climate in which those centers are located. Washington state and Arizona have particularly high water usage.
In areas where electricity is cheaper or more abundant than water, data centers could be cooled through an electrical system rather than with water-filled cooling towers, for example.
Other findings include:
- If one in ten working Americans (about 16 million people) wrote a single 100-word email with ChatGPT each week for a year, those emails would require 435,235,476 liters of water. That’s roughly equivalent to all the water consumed in Rhode Island in a day and a half.
- Sending a 100-word email using GPT-4 requires 0.14 kilowatt-hours (kWh) of electricity, which, according to The Washington Post, is equivalent to leaving 14 LED light bulbs on for an hour.
- If one in ten working Americans wrote a single 100-word email with ChatGPT each week for a year, those emails would consume 121,517 megawatt-hours (MWh) of electricity. That’s as much electricity as all households in Washington, D.C., use in 20 days.
- Training GPT-3 required about 700,000 liters of water.
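The email-scaling estimates above can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below uses the rounded per-email figures quoted in the article (519 mL of water, 0.14 kWh of electricity), so its totals land close to, but not exactly on, the published numbers, which were presumably computed from more precise inputs.

```python
# Rough reconstruction of the article's scaling math, using the rounded
# per-email figures reported above. Inputs are approximations, so the
# totals are close to, not identical to, the article's estimates.

WATER_PER_EMAIL_L = 0.519    # ~519 mL of water per 100-word GPT-4 email
ENERGY_PER_EMAIL_KWH = 0.14  # kWh of electricity per 100-word GPT-4 email
USERS = 16_000_000           # roughly one in ten working Americans
WEEKS = 52                   # one email per week for a year

emails_per_year = USERS * WEEKS
water_liters = WATER_PER_EMAIL_L * emails_per_year
energy_mwh = ENERGY_PER_EMAIL_KWH * emails_per_year / 1_000  # kWh -> MWh

print(f"{water_liters:,.0f} liters of water per year")    # ~432 million L
print(f"{energy_mwh:,.0f} MWh of electricity per year")   # ~116,000 MWh
```

The results (about 431.8 million liters and 116,480 MWh) are within a few percent of the article’s totals, which suggests the reporters started from per-email values slightly more precise than the rounded figures quoted here.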
In a statement to The Washington Post, OpenAI representative Kayla Wood said the ChatGPT creator is “constantly working to improve efficiency.”
SEE: Tech giants can hide greenhouse gas emissions from AI projects by accounting for market-based emissions.
How much electricity does it take to generate an AI image?
In December 2023, researchers from Carnegie Mellon University and Hugging Face found that generating 1,000 AI images consumes an average of 2.907 kWh of electricity, though the amount varies with the size of the AI model and the resolution of the images. Notably, the researchers measured the power consumption of the inference phase, which occurs every time the AI responds to a prompt; previous research had focused on the training phase.
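To put the image figure on the same footing as the email example, the study’s average of 2.907 kWh per 1,000 inferences can be converted to a per-image cost and expressed in the article’s LED-bulb yardstick (0.14 kWh equated to 14 bulb-hours implies a 10 W bulb). A rough sketch under those assumptions:

```python
# Converting the study's per-1,000-inference average into a per-image
# figure, then into the 10 W LED bulb equivalence implied by the
# article's email comparison (0.14 kWh = 14 bulb-hours).

KWH_PER_1000_IMAGES = 2.907  # study average for image generation
LED_BULB_KW = 0.010          # 10 W LED bulb, assumed from the article

kwh_per_image = KWH_PER_1000_IMAGES / 1000        # ~0.0029 kWh per image
bulb_minutes = kwh_per_image / LED_BULB_KW * 60   # ~17 minutes per image

print(f"{kwh_per_image:.4f} kWh per image "
      f"(one 10 W LED bulb running for ~{bulb_minutes:.0f} minutes)")
```

By this yardstick, a single generated image uses roughly 20 times the electricity of a typical LED-bulb minute, versus 14 bulb-hours for a 100-word GPT-4 email.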
While The Washington Post report focused on the high cost of a relatively small AI task (an email), the cost of using AI for more demanding tasks only goes up from there. Image generation produced the most carbon emissions of all the AI tasks the Carnegie Mellon University and Hugging Face researchers tested.
Over-reliance on AI can have negative consequences for both the Earth and the bottom line.
Resource-intensive AI trades short-term gains for worsening drought and mounting pressure on the power grid. Generative AI can also alienate customers: A Google Gemini announcement in August generated a negative consumer reaction, and a July Gartner survey of 5,728 customers found that 64% would prefer companies not use AI in customer service.
Organizations must find ways to incentivize long-term thinking when it comes to the technology employees choose to use on a day-to-day basis. Creating an environmental policy (and adhering to it) can increase customer trust in a company and help spread profits over the long term.
“Many of the benefits of generative AI are speculative and may appear in the future as companies rapidly explore various use cases that could lead to broad adoption,” Benjamin Lee, a professor of engineering at the University of Pennsylvania, said in an email to TechRepublic. “But many of the costs of generative AI are real and incurred immediately as data centers are built, GPUs are powered, and models are deployed.”
“Companies should be assured that, historically, a widely used technology becomes increasingly efficient as computer scientists repeatedly and incrementally optimize the efficiency of the software and hardware over years of constant research and engineering,” Lee said. “The problem with generative AI is that the use cases, software applications, and hardware systems are evolving rapidly. Computer scientists are still exploring the technology, and there is no clear goal for their optimizations.”
One way to mitigate the environmental impacts of AI is to operate data centers with renewable energy (wind, solar, hydroelectric, or nuclear), said Akhilesh Agarwal, chief operating officer of vendor management firm apexanalytix, in an email to TechRepublic.
“It is critical for companies implementing AI technologies to be aware of the potential environmental costs if they do not invest in sustainable practices, as uncontrolled growth of AI could exacerbate global resource consumption issues,” Agarwal said.
On the other hand, AI can “optimize processes, reduce inefficiencies and even contribute to sustainability efforts,” Agarwal said, and its impact should be measured relative to the carbon emissions of a human workforce performing the same tasks.