According to the IEA, AI energy consumption is overestimated, and we should worry more about air conditioning.
Tools like ChatGPT, AI-powered applications like Photoshop, and image generators like Midjourney are ideal for some workflows. However, even the free ones are extremely expensive to run: meeting our demands requires enormous amounts of computing power, which in turn demands large quantities of energy and of water to cool data centers.
The use of AI is expected to drive a sharp increase in global energy consumption (so much so that companies like Google and Meta plan to turn to nuclear energy to meet their needs). However, the International Energy Agency makes it clear that we are overestimating AI's consumption. The real problem will be global warming.
More than entire countries. Artificial intelligence has been with us for a long time, but it has never been as accessible and, therefore, as in demand as it is now. For years we had access to algorithms that let us optimize processes, but the advent of generative AI has multiplied energy consumption tenfold. The two biggest players are Microsoft and Google, each of which reported energy consumption of 24 TWh in 2023.
That figure may not mean much on its own, but each of them already consumes more energy than 100 different countries. In fact, some have pointed out that both, individually, would sit between Libya and Azerbaijan in terms of energy consumption.
What the IEA says. The International Energy Agency itself has repeatedly warned about the high consumption of AI, meaning data centers, and how it will increase in the short term due to growing demand for artificial intelligence-based systems. However, those energy requirements may also be overestimated.
In its latest World Energy Outlook report, the IEA says that while investment in artificial intelligence is increasing, hardware will become ever more efficient (more tasks with less energy) and, moreover, data center energy demand will grow less than that of other industries. Much less, in fact.
Air conditioning. According to the figures the IEA works with, the growth in demand from artificial intelligence data centers through 2030 will be around 202.8 TWh, in line with what is expected from desalination plants (which also consume a great deal of energy to produce drinking water) and behind other sectors such as air conditioning or electric vehicles. In fact, the expected increase in data center demand is only about a third of what will be needed to cool buildings in 2030.
The estimated demand growth for air conditioning is 676 TWh. And yes, data centers' energy needs will also fall below the 473.2 TWh expected for space heating during the colder months. The IEA puts it this way:
“Globally, data centers will account for a relatively small share of overall electricity demand growth through 2030. Hotter and more frequent heat waves than expected, or more stringent performance standards applied to new appliances, especially air conditioners, would shift projected electricity demand by significantly more than even optimistic scenarios for data centers. Rising global temperatures will add more than 1,200 TWh of additional cooling demand worldwide by 2035 in the STEPS scenario; that is more electricity than the entire Middle East consumes today.”
This is still a huge expense. The fact that AI's growth won't be catastrophic for global energy demand doesn't mean its consumption isn't very high. So much so that the IEA itself has convened a global summit to discuss how to cope with the rise of AI. It will be held on December 5 in Paris and will bring together various industry figures.
Perhaps the problem isn't just energy. On the other hand, while it is clear that AI consumes a great deal of resources, it also produces something: a huge amount of electronic waste. AI spending is estimated to have increased eightfold between 2022 and 2023, with much of that money going toward building and equipping data centers.
Here, one of the key issues is not only the number of computing systems but also their technology: companies are discarding old hardware in favor of the latest GPUs from manufacturers like Nvidia, generating an enormous amount of waste.
It’s hard to reuse a GPU that has been running calculations around the clock, but some suggest repurposing that “old” hardware for less demanding, CPU-bound tasks, such as website hosting or backups, or donating it to educational institutions.
Images | Xataka, Garcelor
In Xataka | Samsung puts the handbrake on: it is postponing construction of its state-of-the-art factories in the US and South Korea