Water and energy: the Achilles' heel of generative AI

If we ask ChatGPT whether it is environmentally sustainable, the answer is that today it is not, although the tool clarifies that the matter is complex and depends on several factors. On the one hand, it notes that AI can help design environmental solutions, analyze data or produce climate forecasts. On the other, it acknowledges that “training the models means running data centers for weeks, with a huge carbon footprint, comparable even to that of thousands of homes.”

In fact, some calculations claim that training GPT-3 consumed 1,287 MWh of electricity and produced 550 tons of CO₂, equivalent to 33 flights from Australia to the United Kingdom. To this must be added the subsequent mass use: two months after the tool's launch, it already had 100 million active users.

A question like the one that opens this report travels to a server in a distant data center, where it is processed using somewhere between 114 joules, roughly the equivalent of running a microwave for a tenth of a second, and 6,706 joules, enough energy to run that same microwave for eight seconds.
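As a rough sanity check of those comparisons, here is a back-of-the-envelope sketch; the 800 W microwave rating is an assumed typical value, not a figure from this article:

```python
# Back-of-the-envelope check of the microwave comparison above.
# The 800 W rating is an assumed typical value, not from the article.
MICROWAVE_WATTS = 800  # 1 watt = 1 joule per second

for query_joules in (114, 6706):
    seconds = query_joules / MICROWAVE_WATTS
    print(f"{query_joules} J ≈ running the microwave for {seconds:.1f} s")
# Prints ≈ 0.1 s and ≈ 8.4 s, consistent with the ranges quoted above.
```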

This variation depends on the system used and the questions asked. This same week, the SINC agency reported on a study by the Munich University of Applied Sciences (Germany) which concludes that some questions to an AI emit up to 50 times more carbon dioxide than others. “The Deep Cogito model (70 billion parameters) achieved the greatest accuracy (84.9%), but emitted three times more CO₂ than models of similar size that give more concise responses.” It also specifies that “having DeepSeek R1 (the same number of parameters) answer 600,000 questions would generate as many CO₂ emissions as a round trip from London to New York. By contrast, Qwen 2.5 (72 billion parameters) can answer more than three times that amount, with similar accuracy rates, while generating the same emissions.” In any case, my question has emitted, I am afraid, five times more CO₂ than if I had simply asked Google.

Generative AI adds to the thousand and one interactions we have with the digital world every day. Behind our Instagram photo, a reel (a 15-second one consumes the same energy as publishing eight photos), a document saved in our phone's cloud or a query to ChatGPT there is a data center: a physical building full of servers. “The cloud and the software we use to communicate and work may seem somewhat ethereal, without physical substance, but behind them are infrastructures that support them, the data centers, in clear expansion worldwide. These have high energy consumption and use a great deal of water for cooling, because these large computers dissipate a lot of heat. How much do these centers pollute right now? It depends, to begin with, on which country they are in and the type of energy they consume,” says Verónica Bolón, of the Department of Computer Science and Information Technology at the University of A Coruña.

Rising demand

The International Energy Agency predicts that by 2026 the electricity demand of data centers will double. In 2022 they consumed around 460 TWh, an amount that makes data centers the eleventh-largest electricity consumer in the world, between Saudi Arabia and France, according to the Organisation for Economic Co-operation and Development. The US leads the list with more than 5,000 facilities, which consume 3% of the country's electricity, followed by Germany, China and the United Kingdom.

Electricity consumption is tied to cooling and water use. The MIT document “The Climate and Sustainability Implications of Generative AI” states that for every kWh of energy a data center consumes, it needs two liters of water for cooling. Returning to the US, the country with the most data centers in the world, in 2023 these facilities drank 66 billion liters of water, according to the Lawrence Berkeley National Laboratory.
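Combining those two figures gives a rough, illustrative cross-check; the result is a derived order-of-magnitude sketch, not a number from either source, and real water use varies with cooling technology and climate:

```python
# Derive the electricity use implied by the water figures cited above.
LITERS_PER_KWH = 2             # MIT's cooling-water ratio quoted above
US_WATER_LITERS_2023 = 66e9    # Lawrence Berkeley figure for US data centers

implied_twh = US_WATER_LITERS_2023 / LITERS_PER_KWH / 1e9  # kWh -> TWh
print(f"Implied electricity: about {implied_twh:.0f} TWh")  # ≈ 33 TWh
```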

In addition, large technology companies have been accused of installing part of their data centers in arid areas. An investigation by SourceMaterial and The Guardian identified at least 38 centers in water-scarce areas and another 24 under development. “In 2023 Microsoft said that 42% of its water comes from areas with water stress, and Google, 15%.” The report also mentions Amazon's project in Aragon, licensed to use 755,720 cubic meters of water a year, enough to irrigate 233 hectares of corn, one of the main crops in the region. “The new Amazon data centers are expected to consume more electricity than the entire region currently needs. Meanwhile, in December Amazon asked the regional government to increase water consumption at its three existing data centers by 48%,” says the outlet.

Lower consumption

Concern about the sustainability of AI is widespread. Many companies are exploring alternatives such as air cooling or the use of special fluids to absorb heat more efficiently, reducing dependence on drinking water. The location of data centers is also being reconsidered: choosing areas with cold climates and establishing facilities near renewable energy sources to optimize consumption.

Another avenue is to improve the efficiency of the algorithms themselves. This is what is known as Green AI, or green algorithms, which on the one hand involves new strategies to make AI models more efficient both during training and during execution and, on the other, focuses on applying AI to the fight against climate change and to other environmental solutions. “It's about being a bit consistent. If we are developing an algorithm to detect a serious illness, all available resources should go into it; but with an algorithm that generates a somewhat better or worse image, or recommends a film, perhaps nothing happens if we lose a little performance. Another thing Green AI considers is using algorithms to optimize processes, for example to predict peaks in energy demand or to determine the best time to sow,” says Bolón.
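A prerequisite for that kind of consistency is measuring what an algorithm actually emits. Below is a minimal sketch of the “energy label for algorithms” idea, assuming the open-source codecarbon Python package; the dummy workload is a placeholder, not any model discussed in this article:

```python
# Minimal footprint measurement with codecarbon (pip install codecarbon).
# The workload below is a placeholder, not any model from the article.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="green-ai-sketch")
tracker.start()
total = sum(i * i for i in range(10_000_000))  # stand-in for training/inference
emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent

print(f"Estimated footprint: {emissions_kg:.6f} kg CO2eq")
```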

Do the benefits of AI outweigh its costs, as Sam Altman, CEO of OpenAI, maintains, or, on the contrary, are we overdosing on techno-optimism? Big tech companies admit that their consumption sometimes exceeds their forecasts and are trying to sign up renewables or even plan their own nuclear plants, as Meta has done. “All kinds of solutions to the consumption problem are being investigated. There are advances in ecological data centers where, for example, waste heat is used to run a greenhouse installed next door. There are even projects studying the viability of taking data centers into space or to the bottom of the sea,” says Bolón. Along these lines, a few weeks ago one of the main European data center operators, Data4, announced a collaboration with the Paris-Saclay University foundation to transform heat into biomass through algae production modules. Data4 estimates that this solution will capture “up to 13 tons of CO₂ per year per data center, a potential of 3,900 tons per year for France.”

As was reported a few months ago, between 2010 and 2018 computing instances in data centers worldwide increased by 550% and storage capacity by 2,400%, yet their energy consumption grew only 6%. Is that enough? Bolón qualifies: “When DeepSeek came out, we saw it was a more efficient model, but that happened because its developers did not have access to the most advanced hardware, the latest-generation GPUs. So they had to knuckle down and find another way of working. We also see, when more small language models appear, that the decisions are driven by economics or by access to resources.”

Trivial tasks

As trivial as the question put to ChatGPT at the beginning of this report may seem, a few months ago social networks were flooded for days with anime images in the style of the legendary Studio Ghibli. In just hours, 200 million liters of water had been consumed. Should we prioritize the uses we give to the technology? “I often give this example. Your fridge at home has an energy label and you know what it consumes, because you are the one paying for the electricity. When you go to buy one, you check whether it has an A label, and so on. What about algorithms? You are not paying for the electricity they consume, right? It all feels somewhat ethereal, and it also has two phases: the training phase, which is something that has already happened, and the phase when you are using it. At first researchers worried more about the training part (between 20% and 40% of consumption), but now, with millions of users, the inference or usage part is also a concern (up to 69% of consumption),” notes Bolón. MIT researchers agree that priorities need to be set: “The ease of use of generative AI interfaces and the lack of information about the environmental impacts of my actions mean that, as a user, I have little incentive to cut back on my use of generative AI. Moreover, artificial intelligence models have an especially short useful life, driven by growing demand for new applications. Companies release new models every few weeks, so the energy used to train previous versions ends up wasted.” MIT also points out that new models usually consume more energy for training, “since they tend to have more parameters than their predecessors,” they conclude.

Spain, on the hunt for new facilities

In recent months, large data center projects have been announced in Spain, in some cases controversial over their water use, such as the development by Meta, Zuckerberg's multinational, already approved in Talavera de la Reina, or the aforementioned Amazon project in Aragon. In fact, according to a recent study by Spain DC and Accenture, demand for data centers in Spain will grow by 90% by 2028. The study warns that, despite these positive growth forecasts, in 2018 Spain contributed only 4.3% of the EU's data volume. To close that gap, “an average annual growth in data volume of 22.5% through 2028 is needed,” says the study.