How Your Conversation with ChatGPT Contributes to Climate Change

Photo: Close-up of an iPhone screen showing a ChatGPT conversation asking “what is climate change?” (Domenico Fornas/Shutterstock)
April 2, 2024

Since ChatGPT was unleashed upon humanity, tech companies have raced to put artificial intelligence (AI) into their offerings, prompting Microsoft to call 2023 “the Year of AI.” Recent reports of Google’s decision to integrate the Gemini image generator on iPhones and in Google Image searches exemplify a pervasive trend: AI is everywhere. To be sure, the ability to generate text, images, and videos with just a few prompts—as new AI models allow you to do—is a technological marvel, but beneath the simplicity of that text box lies the massive physical reality and environmental footprint of the resources required to run these AI models.

Building a model like GPT can be resource intensive, yet those costs pale in comparison to the resources needed to actually run it at scale. AI’s power consumption comes in two phases: the energy required to train the model and the energy required to run it.

But what is the exact energy toll of training and running AI systems? While many AI developers don’t share numbers on the carbon footprint of training models, Meta gave us a sense when it shared its open-sourced model, LLaMA. According to Meta’s numbers, training this large language model (LLM)—the same kind of AI used for things like ChatGPT—produced an estimated 539 tons of carbon dioxide (CO2). That is equivalent to the electricity used by more than 100 average U.S. homes in a year. Training GPT-3, a predecessor of ChatGPT, is estimated to have used as much electricity as 177 homes.
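
To see where that “more than 100 homes” comparison comes from, here is a quick back-of-envelope sketch in Python. The per-home emissions figure is my own round assumption, roughly in line with published U.S. averages, not a number from Meta’s report:

```python
# Back-of-envelope: expressing LLaMA's reported training emissions in
# "average U.S. home-years" of electricity use.

TRAINING_EMISSIONS_TONS_CO2 = 539   # Meta's reported estimate for training LLaMA

# Assumption: an average U.S. home's electricity use is responsible for
# roughly 4-5 metric tons of CO2 per year (varies widely by region).
TONS_CO2_PER_HOME_PER_YEAR = 4.3

home_years = TRAINING_EMISSIONS_TONS_CO2 / TONS_CO2_PER_HOME_PER_YEAR
print(f"Roughly {home_years:.0f} home-years of electricity emissions")
# -> roughly 125, i.e., "more than 100 average U.S. homes in a year"
```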

That seems like a lot of energy for one AI model, but it accounts only for training, not for the energy it takes to run these models, and power demands climb as more people actually use them. To better understand why AI has such a huge carbon footprint, it is helpful to think about the heavy infrastructure required for such tech to work.

Most modern AI setups involve running models on several servers living in one or more data centers. A typical server built for AI contains multiple Graphics Processing Units (GPUs)—a type of chip originally developed for video game graphics that has become a cornerstone of AI. These optimized servers use a lot of power and generate a lot of heat, and just like your laptop, servers will produce more heat when they are used more intensely.

In a data center, these power-hungry heaters are stacked in six- to eight-foot-tall towers organized in rows. These massive stacks must be aggressively cooled, lest they all catch fire. Cooling the servers requires numerous fans attached to the servers themselves, along with blasts of air conditioning (or swamp cooling) for the entire building—leaving the inside of these data centers loud and cold. Because servers are supposed to be running 24/7, data centers also tend to have large, onsite diesel generators that kick in when power from the grid goes out. Between the power used by the servers and the energy it takes to cool them down—as well as the possibility that sometimes that energy is generated directly by burning diesel—the environmental impact of a data center can be significant. According to the International Energy Agency, in 2020, data centers were responsible for nearly 1 percent of energy-related greenhouse gas emissions.
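
To make the “power for the servers plus power to cool them” point concrete, here is a minimal sketch using the data-center industry’s Power Usage Effectiveness (PUE) metric, which is not mentioned above; the server count, per-server draw, and PUE value are illustrative assumptions rather than figures for any real facility:

```python
# Illustrative only: total facility energy = IT energy x PUE, where
# PUE (Power Usage Effectiveness) captures cooling and other overhead.

NUM_SERVERS = 1_000        # hypothetical AI cluster size
KW_PER_SERVER = 6.0        # assumed draw for a multi-GPU server under load
PUE = 1.5                  # assumed overhead factor; 1.0 would mean zero overhead
HOURS_PER_YEAR = 24 * 365

it_energy_kwh = NUM_SERVERS * KW_PER_SERVER * HOURS_PER_YEAR
facility_energy_kwh = it_energy_kwh * PUE

print(f"Servers alone:         {it_energy_kwh:,.0f} kWh/year")
print(f"With cooling overhead: {facility_energy_kwh:,.0f} kWh/year")
```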

OpenAI is not forthcoming about the costs of running ChatGPT. However, leaked information and public statements put conservative estimates of ChatGPT’s electricity consumption for just the month of January 2023 at over four million kilowatt-hours. That’s more than the electricity used by 340 U.S. homes in a full year.
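
The homes comparison is simple division. Here is a rough check, assuming an average U.S. home uses somewhere around 11,000 to 12,000 kilowatt-hours of electricity per year; that per-home figure is my assumption, not part of the leaked estimates:

```python
# Rough check of the "340 U.S. homes" comparison.

CHATGPT_KWH_JAN_2023 = 4_000_000   # the conservative monthly estimate cited above
KWH_PER_HOME_PER_YEAR = 11_700     # assumed annual electricity use of an average U.S. home

homes_equivalent = CHATGPT_KWH_JAN_2023 / KWH_PER_HOME_PER_YEAR
print(f"~{homes_equivalent:.0f} homes' worth of annual electricity, from one month of queries")
```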

Even when companies provide high-quality power consumption numbers, though, determining the precise climate impact of any given AI model is hard, if not impossible. Leaving aside the fact that the numbers you would need to make such a determination are not publicly released, there are many more variables than the ones I have already alluded to. But it is safe to say that the climate impact is tangible, and as the popularity of any given model increases, so too will the impact of running it.

The fact that there are so many variables, however, gives us the opportunity to focus on reducing climate impact when we build, train, and use new AI models. Sustainable AI development and use requires finding more efficient ways to train AI models and choosing more climate-friendly data centers: ones in parts of the world that already rely heavily on renewable energy, and ones that aren’t straining limited local water resources.
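
Why location matters comes down to arithmetic: the same workload emits very different amounts of CO2 depending on the carbon intensity of the local grid. The intensities below are illustrative round numbers I am assuming for the sake of the sketch, not measurements for any particular region:

```python
# Same workload, very different footprints: emissions = energy x grid carbon intensity.

WORKLOAD_KWH = 1_000_000   # hypothetical training or serving workload

# Assumed, illustrative carbon intensities (kg CO2 per kWh), not measured values:
GRID_INTENSITY = {
    "coal-heavy grid": 0.80,
    "average U.S. grid": 0.40,
    "hydro/wind-heavy grid": 0.05,
}

for grid, kg_per_kwh in GRID_INTENSITY.items():
    tons_co2 = WORKLOAD_KWH * kg_per_kwh / 1_000
    print(f"{grid:>22}: {tons_co2:>6,.0f} metric tons CO2")
```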

To prevent new AI models from becoming ecological catastrophes, AI experts need to be intentional about how they design them. Thankfully, some folks in the field have already noticed that the carbon footprint of AI is worth tracking.
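
As one example of that tracking, the open-source CodeCarbon library estimates the emissions of a block of Python code while it runs. Treat this as a minimal sketch of how such tools are used, not as the specific tool the author has in mind; the project name and workload below are placeholders:

```python
# Minimal sketch: estimating the carbon footprint of a training job with the
# open-source CodeCarbon library (pip install codecarbon). The workload below
# is a placeholder, not a real training loop.
from codecarbon import EmissionsTracker

def train_model():
    # Placeholder computation standing in for an actual training run.
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker(project_name="llm-training-demo")  # hypothetical project name
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2e")
```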

In our increasingly digital world, we are encouraged to see technology as frictionless, weightless, and ethereal. Terms like “wireless” and “the cloud” further obscure the reality that our current and future technologies, including widespread AI, rely on vast physical infrastructure powered predominantly by dirty energy sources. When assessing new AI models, we need to evaluate more than their “wow” factor, or the ways model developers hope they will be used. We should also consider whether the tech companies and researchers building AI systems are actively working to reduce their models’ carbon footprints. There is nothing weightless about burning coal to power a data center. And getting developers to prioritize environmental impact during the design process is the best way to keep those impacts from ballooning. This will point us in the right direction toward a more sustainable digital future, while still allowing all of us to enjoy some of that “wow.”


