
The Thread

How Your Conversation with ChatGPT Contributes to Climate Change

Domenico Fornas/Shutterstock

Since ChatGPT was unleashed upon humanity, tech companies have raced to put artificial intelligence (AI) into their offerings, prompting Microsoft to call 2023 the year of AI. Recent reports of Google weaving AI into search and Google Image results exemplify a pervasive trend: AI is everywhere. To be sure, the ability to generate text, images, and videos with just a few prompts, as new AI models allow you to do, is a technological marvel. But beneath the simplicity of that text box lies the massive physical reality and environmental footprint of the resources required to run these AI models.

Building a model like GPT can be enormously expensive, and those costs come on top of the resources needed to actually run it. AI's power consumption comes in two phases: the energy required to train the model, and the energy required to run it.

But what is the exact energy toll of training and running AI systems? While many AI developers don't share numbers on the carbon footprint of training models, Meta gave us a sense when it shared its open-sourced model, LLaMA. According to Meta's numbers, training this large language model (LLM), the same kind of AI used for things like ChatGPT, produced an estimated 539 tons of carbon dioxide (CO2). That is roughly equivalent to the annual emissions from the electricity used by about 100 U.S. homes. Training GPT-3, a predecessor of ChatGPT, is estimated to have used nearly 1,300 megawatt-hours of electricity.
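Estimates like these generally follow a simple accounting: multiply total GPU-hours by per-GPU power draw, a data center overhead factor, and the carbon intensity of the local grid. A minimal Python sketch of that arithmetic, where every number is an illustrative assumption rather than an actual figure from Meta or OpenAI:

```python
# Back-of-the-envelope training footprint:
# GPU-hours x per-GPU power x data center overhead x grid carbon intensity.
# Every number below is an illustrative assumption, not a reported figure.

GPU_HOURS = 1_000_000        # assumed total GPU-hours for the training run
GPU_POWER_KW = 0.4           # assumed average draw per GPU (400 W)
PUE = 1.1                    # assumed data center overhead multiplier
GRID_KG_CO2_PER_KWH = 0.4    # assumed grid carbon intensity

energy_kwh = GPU_HOURS * GPU_POWER_KW * PUE
emissions_tons = energy_kwh * GRID_KG_CO2_PER_KWH / 1000  # kg -> metric tons

print(f"Energy: {energy_kwh:,.0f} kWh")
print(f"Emissions: {emissions_tons:,.0f} metric tons of CO2")
```

The model is linear, which is why each factor matters: halving the grid's carbon intensity halves the footprint, regardless of how long the run takes.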

That seems like a lot of energy for one AI model, but it accounts only for training, not for the energy it costs to run these models, and power demands grow as people actually use them. To better understand why AI has such a huge carbon footprint, it helps to think about the heavy infrastructure required for such tech to work.


Most modern AI setups involve running models on several servers living in one or more data centers. A typical server built for AI contains multiple Graphics Processing Units (GPUs), a type of chip originally developed for video game graphics that has since become the workhorse of AI computation. These optimized servers use a lot of power and generate a lot of heat, and just like your laptop, servers produce more heat when they are used more intensely.

In a data center, these power-hungry heaters are stacked in six- to eight-foot-tall towers organized in rows. These massive stacks must be aggressively cooled, lest it all catch on fire. Cooling the servers requires numerous fans attached to the servers themselves, along with blasts of air conditioning (or swamp cooling) for the entire building, leaving the inside of these data centers loud and cold. Because servers are supposed to run 24/7, data centers also tend to have large, onsite diesel generators that kick in when power from the grid goes out. Between the power used by the servers, the energy it takes to cool them down, and the possibility that some of that energy is generated directly by burning diesel, the environmental impact of a data center can be significant. According to the International Energy Agency, in 2020, data centers were responsible for nearly 1 percent of global electricity demand.
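The cooling and overhead burden described above is commonly summarized with a metric called power usage effectiveness (PUE): total facility energy divided by the energy delivered to the IT equipment itself. A short sketch, with assumed numbers for illustration:

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean zero overhead; real facilities run higher because
# of cooling, lighting, and power conversion losses.

def total_facility_kwh(it_load_kwh: float, pue: float) -> float:
    """Scale the IT load by PUE to get whole-building consumption."""
    return it_load_kwh * pue

it_load = 100_000.0  # assumed monthly server consumption in kWh

for pue in (1.1, 1.5, 2.0):  # efficient hyperscaler vs. older facility
    overhead = total_facility_kwh(it_load, pue) - it_load
    print(f"PUE {pue}: {overhead:,.0f} kWh spent on cooling and other overhead")
```

At a PUE of 2.0, every kilowatt-hour of computation costs a second kilowatt-hour just to keep the building running, which is why facility efficiency is as important as chip efficiency.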

OpenAI is not forthcoming about the costs of running ChatGPT. However, leaked information and public statements put conservative estimates of ChatGPT's electricity consumption for just the month of January 2023 at more than what 340 U.S. homes use in a full year.
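The homes comparison is straightforward arithmetic: divide a model's monthly consumption by an average U.S. household's annual consumption (roughly 10,600 kWh per year, per the EIA). A sketch, where the monthly model figure is purely an illustrative assumption:

```python
# Convert a model's monthly electricity use into "U.S. home-years."
AVG_US_HOME_KWH_PER_YEAR = 10_600   # approximate EIA average for a U.S. home
monthly_model_kwh = 3_700_000       # assumed monthly figure, for illustration

home_years = monthly_model_kwh / AVG_US_HOME_KWH_PER_YEAR
print(f"Equivalent to about {home_years:,.0f} U.S. homes' annual electricity use")
```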

Even when companies provide high-quality power consumption numbers, though, determining the precise climate impact of any given AI model is hard, if not impossible. Leaving aside that the numbers you would need to make such a determination are not publicly released, there are so many more variables than the ones to which I already alluded. But it is safe to say that the climate impact is tangible, and as the popularity of any given model increases, so too will the impact of running it.

The fact that there are so many variables, however, gives us the opportunity to focus on reducing climate impact when we build, train, and use new AI models. Sustainable AI development and use requires finding more efficient AI models and choosing more climate-friendly data centers, such as those in parts of the world whose grids already run largely on clean energy, and ones located in places that aren't already water-stressed.
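One reason data center location matters so much: for the same workload, emissions scale linearly with the grid's carbon intensity. A sketch comparing assumed intensity values (all illustrative; real intensities vary by region and even by hour):

```python
# Same workload, different grids: emissions scale linearly with
# the grid's carbon intensity (kg of CO2 per kWh).

def emissions_tons(workload_kwh: float, kg_co2_per_kwh: float) -> float:
    """Metric tons of CO2 for a workload on a given grid."""
    return workload_kwh * kg_co2_per_kwh / 1000

workload = 500_000.0  # assumed monthly consumption in kWh

# Illustrative grid carbon intensities, kg CO2 per kWh
grids = {"hydro-heavy grid": 0.03, "mixed grid": 0.40, "coal-heavy grid": 0.80}

for name, intensity in grids.items():
    print(f"{name}: {emissions_tons(workload, intensity):,.1f} tons CO2")
```

Under these assumed values, running on the coal-heavy grid emits more than twenty-five times what the hydro-heavy grid would for identical work, which is why siting decisions can dwarf software-level optimizations.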

To prevent new AI models from becoming ecological catastrophes, AI experts need to be intentional about how they design them. Thankfully, some folks in the field have already taken notice.

In our increasingly digital world, we are encouraged to see technology as frictionless, weightless, and ethereal. Terms like "wireless" and "the cloud" further obscure the reality that our current and future technologies, including widespread AI, rely on vast physical infrastructure powered predominantly by dirty energy sources. When assessing new AI models, we need to evaluate more than their "wow" factor, or the ways model developers hope they will be used. We should also consider whether the tech companies and researchers building AI systems are actively working to reduce their model's carbon footprint. There is nothing weightless about burning coal to power a data center. And getting developers to prioritize environmental impact during the design process is the best way to keep those impacts from ballooning. This will point us in the right direction toward a more sustainable digital future, while still allowing all of us to enjoy some of that "wow."

More About the Authors

Nat Meysenburg

Technologist, Open Technology Institute
