
The dirty secret of artificial intelligence

The emergence of tools such as ChatGPT has triggered warnings that global data center energy consumption could increase fivefold

A water cooling system in Google’s data center in The Dalles, Oregon (USA).

Everyday activities like using GPS to map out the best driving route or translating a document consume energy, water and mineral resources, and lots of them. These applications run in the cloud, a nebulous term for the millions of powerful computers housed in vast data centers worldwide. Mobile applications depend on legions of these machines to store trillions of data points and perform split-second operations (e.g., calculating travel time based on distance and traffic volume). Estimates put data centers’ energy consumption at between 1% and 2% of the global total, and all signs indicate that figure is about to skyrocket.

Generative artificial intelligence (AI), the technology underlying intelligent chatbots like ChatGPT and the tools that generate original artwork and music from a user’s text prompts, requires a lot of computing power. Microsoft, Google and other big tech companies are now integrating AI functionality into search engines, text editors and email. Our relationship with the applications we use every day is about to change: for years, we have entered series of commands to carry out certain tasks; in the near future, we will converse with our computers, asking them to perform tasks we used to do ourselves.

What effect will this paradigm shift have on the environment? No one knows for sure, but most people think the impacts will be dramatic. “AI may seem ethereal, but it is physically reshaping the world,” writes Kate Crawford in her 2021 book, Atlas of AI. Crawford, an Australian senior researcher at Microsoft Research and director of the AI Now Institute, warned two years ago that this technology’s “planetary costs” are constantly growing. Four years ago, some scientists estimated that the technology sector would account for 14% of global emissions by 2040; others predicted that data center energy demand would increase by a factor of 15 by 2030.

All these forecasts may fall short because they predate the emergence of ChatGPT. Google and Microsoft have hundreds of millions of users: what will happen if they all start using tools supported by generative AI? Martin Bouchard, a Canadian co-founder of the digital infrastructure company QScale, believes that each generative AI query requires at least four to five times more computing power than a conventional search. When asked about their current and future energy consumption in the era of generative AI, Google and Microsoft did not provide specific numbers; they only reiterated their goal of achieving carbon neutrality by 2030. For Crawford, that “means they offset emissions by buying credits from other people” through compensatory activities like planting trees.

Rows of servers in Google’s data center in Douglas, Georgia (USA).

How much does AI pollute?

“Ordinary search engines already consume a lot of energy because they are complex systems that crawl through millions of web pages,” says Carlos Gómez Rodríguez, a professor of computer science and artificial intelligence at the University of A Coruña (Spain). “But generative AI produces even more emissions than search engines because it uses neural network architectures with millions of parameters that must be trained.”

A couple of years ago, the computer industry’s carbon footprint caught up with that of aviation. Training a single natural language processing model produces emissions equivalent to the lifetimes of five gasoline-burning cars, from the factory to the junkyard, or to 125 round-trip flights between Beijing and New York. And there are other natural resource costs. A study published in Nature found that Google required 4.17 billion gallons (15.8 billion liters) of water to cool its data centers in 2021, while Microsoft used almost a billion gallons (3.6 billion liters). Rare metals are extracted from mines worldwide to make electronic components. In short, AI is a technology with major environmental impacts.


There is no public data on how much energy, or what kind of energy, the big tech companies consume, even though they are the only ones with infrastructure robust enough to train and power the large language models on which generative AI relies. Nor is there published data on the water consumption of data center cooling systems, an issue already causing tension in countries like the U.S., Germany and the Netherlands. Companies are not required to provide this information. “What we have are estimates. For example, training GPT-3, the model underlying ChatGPT, could have generated about 500 tons of carbon, the equivalent of traveling by car to the Moon and back. It may not seem like much, but consider that the model has to be periodically retrained to incorporate updated data,” says Gómez. OpenAI recently unveiled GPT-4, its most advanced model yet. And the race goes on.

Another estimate indicates that the electricity consumed in January 2023 by OpenAI, the organization that created ChatGPT, could equal the annual consumption of 175,000 Danish households, and Danish households are not among the world’s biggest energy consumers. “These are projections based on the current usage of ChatGPT. If it becomes even more widespread, we could be talking about the equivalent of the electricity consumption of millions of people,” says Gómez.

Aerial view of Google’s data center in Saint-Ghislain, Belgium.

The high cost of training algorithms

The lack of data may soon be resolved. The European Union (E.U.), aware of the growing energy consumption of data centers, is developing a directive, slated for consideration next year (which means it will be at least two years before it takes effect), that establishes energy efficiency and transparency requirements. The U.S. is working on similar regulation.

“There are three sources of AI carbon emissions: the hardware used, the carbon intensity of the energy source, and the energy consumed to train the model,” says Alex Hernández, a postdoctoral researcher at the Quebec Artificial Intelligence Institute (MILA).

Most of the emissions are produced during AI model training, a step that is fundamental to developing machine-learning models, the AI modality that has grown the fastest in recent years. Millions of examples are presented to the algorithm so it can establish the patterns that enable it to make predictions. Large language models, for example, are trained on text samples, which is how a model learns that “round” is the most likely word to follow “the Earth is.”
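The toy sketch below illustrates that idea in miniature. It is not how systems like ChatGPT actually work (they use neural networks with billions of learned parameters rather than word counts), but it shows the core principle of learning next-word probabilities from examples; the four-sentence corpus is invented for illustration.

```python
from collections import Counter, defaultdict

# Tiny invented corpus, standing in for the millions of text samples
# a real language model is trained on.
corpus = [
    "the earth is round",
    "the earth is round and vast",
    "the earth is blue from space",
    "the moon is round",
]

# Count how often each word follows each other word (a bigram model).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the word most frequently observed after `word`."""
    return follows[word].most_common(1)[0][0]

print(most_likely_next("is"))  # -> "round" (seen 3 times, vs. "blue" once)
```

A real model replaces these simple counts with a neural network tuned over weeks of GPU time, which is exactly where the energy bill discussed here comes from.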

OpenAI’s electricity consumption in January 2023 could equal the annual consumption of 175,000 Danish households

Most data centers use advanced, energy-intensive processors called GPUs (graphics processing units) to train AI models. Training large language models requires tens of thousands of GPUs running day and night for weeks or even months, according to a recent Morgan Stanley report.

“Large language models have expansive architectures. A machine-learning algorithm that helps you choose whom to hire may need 50 variables: where candidates currently work, their salaries, previous experience, and so on. ChatGPT has more than 175 billion parameters,” says Ana Valdivia, a postdoctoral researcher in computing and AI at King’s College London. “You have to train that enormous structure, then store and process all the data. Data storage also consumes resources.”

Alex Hernández recently presented a paper analyzing the energy consumption of 95 AI models. “The hardware they all use is very similar. But if you train your model in Quebec, where most electricity is hydroelectric, you reduce carbon emissions by a factor of 100 or more compared to places where electricity is generated from coal, gas and other sources.” An estimated 73% of China’s data centers are powered by coal plants, which produced at least 100 million tons of CO₂ in 2018.
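The arithmetic behind that factor is simple: operational emissions are roughly the energy consumed multiplied by the carbon intensity of the grid supplying it. The sketch below uses illustrative intensity figures, broadly in line with published lifecycle estimates for hydro- and coal-dominated grids, and a hypothetical training-run size; none of the numbers come from Hernández’s paper.

```python
# Operational emissions ~= energy consumed x carbon intensity of the grid.
# Intensities are rough lifecycle estimates in grams of CO2eq per kWh,
# chosen only to illustrate the scale of the gap.
GRID_INTENSITY_G_PER_KWH = {
    "hydro-dominated (e.g. Quebec)": 2,
    "coal-dominated": 800,
}

TRAINING_ENERGY_KWH = 1_000_000  # hypothetical large training run

for grid, intensity in GRID_INTENSITY_G_PER_KWH.items():
    tons = TRAINING_ENERGY_KWH * intensity / 1_000_000  # grams -> metric tons
    print(f"{grid}: {tons:,.0f} tons CO2eq")

# Prints ~2 tons for the hydro grid vs. ~800 tons for the coal grid:
# the same energy use, with a gap of a factor of several hundred.
```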

Yoshua Bengio, whose work on deep neural networks earned him the Turing Award (the top global award in computer science), led a team at MILA that developed CodeCarbon, a tool for measuring the carbon footprint of developing and training algorithms. The goal is for IT professionals to integrate it into their code so they can measure their emissions and take them into account in programming decisions.
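In practice, integrating it looks roughly like the sketch below, which assumes the Python package’s documented EmissionsTracker interface; train_model is a hypothetical stand-in for real training code.

```python
from codecarbon import EmissionsTracker  # pip install codecarbon

def train_model():
    # Hypothetical stand-in for an actual training loop.
    ...

# The tracker samples the machine's power draw and combines it with
# regional grid data to estimate emissions.
tracker = EmissionsTracker()
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2eq

print(f"Training emitted roughly {emissions_kg:.3f} kg of CO2eq")
```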

More computing capacity

In 2018, OpenAI conducted a study warning of the need to prepare for the day when much higher-capacity systems are needed. It found that the computing capacity used to train the largest AI models was doubling every three to four months, much faster than Moore’s Law, the observation that the number of transistors in an integrated circuit doubles about every two years. “Considering the models being trained, more computing capacity will be needed to operate them. The big technology companies are probably already buying more servers,” says Gómez.
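A back-of-the-envelope comparison shows how different those growth rates are; the 3.5-month doubling period below is simply the midpoint of the range the study reported.

```python
# Compare two years of growth under each doubling period.
MONTHS = 24
AI_DOUBLING_MONTHS = 3.5      # midpoint of the reported 3-4 month range
MOORE_DOUBLING_MONTHS = 24.0  # transistor counts double about every 2 years

ai_growth = 2 ** (MONTHS / AI_DOUBLING_MONTHS)        # ~116x
moore_growth = 2 ** (MONTHS / MOORE_DOUBLING_MONTHS)  # 2x

print(f"AI training compute: ~{ai_growth:.0f}x in two years")
print(f"Moore's Law pace: {moore_growth:.0f}x in two years")
```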

Hernández is less concerned about the emissions from operating AI models. “There is a lot of research aimed at reducing the number of parameters and the complexity of the models so that they consume less energy. There are fewer opportunities to reduce energy consumption during training, because it requires sustained and intensive use of hardware. Operation is relatively easy to optimize; training, not so much.”

One possible way to make training less polluting would be to reduce the complexity of the algorithms without sacrificing performance. “Does it really take millions of parameters to build good models?” asks Valdivia. “Many biases have been discovered in ChatGPT, for example, and research is underway to achieve the same results with simpler architectures.”
