
Why data centers want to have their own nuclear reactors

Big technology companies see this energy source as a way to satisfy their extremely high power demand, which has skyrocketed since the advent of AI

General view of the four cooling towers of the Mochovce nuclear power plant in Slovakia. Janos Kummer (Getty Images)
Manuel G. Pascual

Sam Altman, CEO of OpenAI, the company that created ChatGPT, issued a warning in January at the World Economic Forum in Davos: the artificial intelligence (AI) industry is about to cause an energy crisis. The new generation of generative AI will consume much more energy than expected, he told the world’s leaders and entrepreneurs, to the point of straining global energy grids. “There’s no way to get there without a breakthrough,” he said curtly.

The breakthrough he was alluding to is so-called advanced nuclear energy, a term that covers small reactors and nuclear fusion, both still at the experimental stage. Several companies are betting on this alternative, which would give them energy autonomy and greater cost control. The Biden Administration is not opposed to the idea either. In March, Energy Secretary Jennifer Granholm met with representatives of several technology companies, including Amazon, Google and Microsoft, to explore imaginative ways to cover their energy needs. One of the topics discussed was the possibility of using small nuclear reactors to power data centers, the vast warehouses full of processors running day and night.

According to the latest estimates, 8% of the world’s energy is already being used for AI; this energy powers the processors on which the models are trained and the systems are hosted. That figure, as Altman predicted, will soon be an underestimate, as new users join and new versions of ChatGPT, Gemini and Copilot arrive, each requiring more and more computing power. “I’m glad he said it. I’ve seen consistent downplaying and denial about the AI industry’s environmental costs since I started publishing about them in 2018,” wrote Kate Crawford, one of the leading researchers on AI’s footprint, in Nature.

Big tech has already taken its first steps towards nuclear power, an energy source in decline in the West (where plant closures outpace openings), with some major exceptions: the United States, France, the United Kingdom and several Eastern European countries. Companies, for their part, see nuclear power as a way to secure a stable, long-term supply of energy at a time when the existing supply is not enough. Senior Google executives told The Wall Street Journal that they are considering signing a power purchase agreement (PPA) with developers of small modular reactors (SMRs). “I do think nuclear, especially advanced nuclear, is making a lot of progress,” Maud Texier, Google’s global director of clean energy and decarbonization development, told The Wall Street Journal, comparing the cost of nuclear projects today to where wind and solar were 15 years ago. “Cost decline is going to be a function of deployment,” she said. Company sources would not confirm to EL PAÍS whether the nuclear route is an option for the future, although they did not deny it either. Google recently signed an agreement with Microsoft and Nucor to accelerate advanced clean energy technologies, including “advanced nuclear.”

In October 2023, Microsoft signed a PPA with the American company Helion Energy under which the latter will supply it with energy obtained from nuclear fusion starting in 2028. The technique is still more theoretical than practical and, unlike fission, does not produce long-lived radioactive waste. Asked by this newspaper about its strategy in the nuclear field, Microsoft pointed to a policy brief from December titled Accelerating a carbon-free future, which makes it clear that advanced nuclear and fusion energy, alongside traditional reactors, are among the pillars of Microsoft’s green policy, although no deadlines or dates are given.

Image of the interior of the US Lawrence Livermore National Laboratory, a facility where nuclear fusion has been achieved. HANDOUT (AFP)

AWS, Amazon’s cloud computing division, has recently purchased a large data center in the United States located next to the country’s sixth largest nuclear power plant, which supplies it with 100% of its energy at a fixed price. “To complement our wind and solar projects, which depend on weather conditions to generate energy, we are also exploring innovations and technologies and investing in other sources of clean, carbon-free energy. The agreement with Talen Energy [the company that owns the aforementioned US nuclear power plant] for carbon-free energy is a project that goes in that direction,” company sources told EL PAÍS.

Silicon Valley's nuclear flirtation

The idea that nuclear energy is AI’s salvation is catching on among the Silicon Valley jet set. Sam Altman is one of its biggest supporters. He is so convinced of the prospects of Helion Energy, a nuclear fusion pioneer, that he has invested $375 million in the company. Altman also chairs Oklo, a startup that aims to design and manufacture nuclear fission reactors like those used today, but much smaller: the aforementioned SMRs.

Bill Gates is another technology tycoon with interests in SMRs. His company TerraPower is working on a sodium-cooled nuclear reactor, an experimental design that, if successful, promises to be 25 times cheaper than conventional fission plants.

Meta’s chief generative AI engineer, Sergey Edunov, said a few months ago that just two large nuclear reactors would be enough to cover AI’s entire projected global energy demand for 2024, including both powering already operational models and training new ones.
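A rough reconstruction of that kind of back-of-envelope estimate might look like the sketch below; the GPU count, power draw and data-center overhead are illustrative assumptions, not Edunov’s own figures.

```python
# Back-of-envelope estimate of AI's electricity demand versus reactor output.
# All inputs are illustrative assumptions, not figures from Meta or Edunov.
gpus_in_service = 1_500_000     # assumption: AI accelerators deployed worldwide in a year
watts_per_gpu = 700             # assumption: peak draw of one modern data-center GPU
overhead = 1.5                  # assumption: extra power for cooling and other infrastructure

total_gw = gpus_in_service * watts_per_gpu * overhead / 1e9
reactor_gw = 1.0                # a typical large reactor delivers roughly 1 GW of electricity

print(f"~{total_gw:.1f} GW, or about {total_gw / reactor_gw:.1f} large reactors")
# With these assumptions: ~1.6 GW, i.e. roughly two large reactors.
```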

Does the nuclear route have a future? “There are no advances on the horizon that would allow for the immediate deployment of SMRs, which are currently in the initial prototyping phase in numerous countries. This option would only be viable if we are talking about a period of decades,” says engineer Heidy Khlaaf, a specialist in the evaluation, specification and verification of complex computer applications in safety-critical systems. Some countries, such as the United Kingdom, France, Canada and the United States, have plans to develop this type of facility, but not within the next 20 years.

Khlaaf is especially concerned that Microsoft has put generative AI to work on streamlining the paperwork needed to obtain nuclear licenses, a process that can take years and cost millions of dollars. “This is not a box-ticking exercise, but a process of self-assurance. Considering these regulatory processes as mere cumbersome paperwork says a lot about your understanding, or lack thereof, of nuclear safety,” she says.

Is it realistic to trust the future of AI to nuclear fusion? Helion Energy’s most optimistic estimates say that in 2029 it will be able to produce enough energy to supply 40,000 average homes in the United States. It is estimated that ChatGPT already consumes the equivalent of 33,000 homes today.

Why so much energy consumption?

The emergence of AI has shaken the global energy scene. Most of the consumption associated with generative AI models occurs before they are used, during the training phase. This is a key process in the development of deep learning models that consists of showing the algorithm millions of examples that help it establish patterns with which to predict situations. In the case of large language models, such as ChatGPT, the system is expected to conclude that the series of words “the color of the sea is” has a high probability of being followed by the word “blue.”
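As a purely illustrative sketch of that idea, the snippet below asks the openly available GPT-2 model, via the Hugging Face transformers library, which words it considers most likely to come next; GPT-2 is a stand-in assumption here, since ChatGPT’s own model is not public.

```python
# A minimal sketch of next-token prediction. GPT-2 stands in as an openly
# available model; ChatGPT's internals and probabilities are not public.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "the color of the sea is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits              # shape: (batch, sequence, vocabulary)

probs = torch.softmax(logits[0, -1], dim=-1)     # probability distribution over the next token
top = torch.topk(probs, k=5)

for p, token_id in zip(top.values, top.indices):
    word = tokenizer.decode([token_id.item()])
    print(f"{word!r:>12}  probability={p.item():.3f}")
```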

Most data centers use advanced processors called GPUs to train AI models. GPUs require a lot of energy to operate, about five times more than CPUs (conventional processors). Training large language models requires tens of thousands of GPUs, which need to operate day and night for weeks or months.
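A back-of-envelope calculation gives a sense of the order of magnitude involved; the GPU count, power draw and duration below are illustrative assumptions, not any company’s actual figures.

```python
# Rough estimate of the electricity consumed by one large training run.
# All inputs are illustrative assumptions.
num_gpus = 10_000          # assumption: "tens of thousands of GPUs"
watts_per_gpu = 700        # assumption: a modern data-center GPU under full load
hours = 60 * 24            # assumption: two months of round-the-clock training

energy_kwh = num_gpus * watts_per_gpu * hours / 1_000
print(f"Estimated training energy: {energy_kwh:,.0f} kWh")
# With these assumptions: ~10 million kWh, before counting cooling overhead.
```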

“Large language models have a very large architecture. A machine learning algorithm that helps you choose who to hire might need 50 variables: where the candidate works, what salary they have now, previous experience and so on. The first version of ChatGPT has more than 175 billion parameters,” explains Ana Valdivia, a lecturer in Artificial Intelligence, Government & Policy at the Oxford Internet Institute.
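To get a feel for that scale, here is a rough calculation of the memory needed just to store that many parameters, assuming two bytes per parameter (a common 16-bit format); the assumption is illustrative, not a description of any specific deployment.

```python
# Memory needed simply to hold 175 billion parameters, under an assumed 16-bit format.
params = 175_000_000_000
bytes_per_param = 2                         # assumption: 16-bit floating-point weights

gib = params * bytes_per_param / 2**30
print(f"{gib:,.0f} GiB just to hold the weights")   # ~326 GiB with these assumptions
```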

Once the model has been trained, it must be hosted and the data it works on must be served. This also happens in data centers, which have to operate day and night.

What is the total consumption of AI? How much energy goes into training and running the most commonly used models? Companies don’t publish that information, so all we have are estimates. For example, Google’s Gemini Ultra model, one of the most advanced today, required 50 billion petaFLOPs of computation to train, according to a recent report from Stanford University. Achieving that kind of computing power with ordinary commercial computers (although in practice supercomputers are used for these tasks) would require about 10,000,000,000,000,000 (10 to the power of 16) of them. The cost associated with this training was $191 million, largely attributable to the energy it consumed.
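The arithmetic behind that figure can be sketched as follows; the per-chip throughput and cluster size below are illustrative assumptions, not Google’s actual setup.

```python
# Converting the Stanford figure and estimating how long such a run would take
# on an assumed cluster. Throughput and cluster size are illustrative assumptions.
total_flop = 50e9 * 1e15        # 50 billion petaFLOP = 5 x 10^25 floating-point operations
gpu_flops = 3e14                # assumption: ~300 TFLOP/s sustained per accelerator
num_gpus = 10_000               # assumption: size of the training cluster

seconds = total_flop / (gpu_flops * num_gpus)
print(f"~{seconds / 86_400:,.0f} days of continuous computing under these assumptions")
# With these assumptions: roughly 190 days of non-stop work.
```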

A single AI model can consume tens of thousands of kilowatt-hours; generative AI models such as ChatGPT can consume around 100 times more, according to estimates by the technology consulting firm IDC.

Apart from powering the processors themselves, energy also goes into cooling them. The most common techniques are electric ventilation and the use of water to cool both the environment and the machines. The latter is beginning to cause problems in places with water scarcity, although the most modern designs use closed circuits that minimize water losses.
