Between energy guzzlers and sustainability saviors

The development of artificial intelligence is proceeding at a rapid pace. The technology can make a decisive contribution to social well-being – in healthcare, mobility or the fight against climate change. But for AI to contribute to greater sustainability, its energy consumption must be reduced.

Artificial intelligence (AI) has the potential to help solve some of society’s toughest problems, including climate change. At the same time, AI – and generative AI in particular – requires enormous computing power and therefore a lot of electricity. Currently, the computing power required for AI models doubles every five to six months, so as demand for the technology grows, power consumption will continue to rise. Data centers already account for up to 1.5 percent of global electricity consumption and have a global CO2 footprint roughly comparable to that of global air traffic. In Germany alone, their energy requirement exceeds 18 billion kilowatt-hours per year, which corresponds to about 0.55 percent of total energy consumption. This is particularly worrying when you consider that around 75 percent of greenhouse gas emissions in the EU are caused by energy consumption.

Gartner’s latest findings predict that AI could help reduce global greenhouse gas emissions by five to ten percent by 2030. At the same time, however, AI is expected to account for up to 3.5 percent of global electricity consumption by the same year. The technology industry now faces a clear challenge: to exploit the technology’s full potential without letting the energy demands of artificial intelligence consume its benefits.

AI power consumption

The energy consumption of artificial intelligence has two components: energy is consumed both during model training and during inference. Training involves iteratively improving models with data, while inference means running live data through an already trained AI model to solve specific tasks. According to a research report, inference can account for at least 60 percent of the energy consumption of generative AI, and incorporating AI capabilities into web search queries, for example, can increase energy requirements tenfold.

When using a search engine, the user enters a single query; with a generative model, the user conducts a dialog to reach the desired result. This leads to more queries overall than with a traditional search engine.

As new applications for generative artificial intelligence emerge in the text, image and video domains, an increase in large models that are trained, retrained and refined on a daily basis is also expected. Current generative AI models require more than 200 times the computing power for training compared to previous generations. Each new model generation requires additional computing power for inference and increases the energy needed for training. It is a continuous cycle that places ever greater demands on the underlying infrastructure.

In terms of hardware, the graphics processing units (GPUs) used for artificial intelligence require significantly more power than a traditional CPU-based system. Current GPUs can consume up to 700 watts, and a typical configuration uses eight GPUs per server. Such a server draws almost six kilowatts, compared to about one kilowatt for a conventional two-processor server of the kind companies use for virtualization. This is why the process needs to become more sustainable.
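The power gap described above is simple arithmetic; a minimal sketch, using only the wattage figures quoted in the text:

```python
# Back-of-the-envelope power comparison between an AI server and a
# conventional two-processor virtualization server (figures from the text).
GPU_WATTS = 700        # peak draw of a current AI GPU
GPUS_PER_SERVER = 8    # typical configuration

ai_server_kw = GPU_WATTS * GPUS_PER_SERVER / 1000   # GPU draw alone, in kW
conventional_kw = 1.0                               # conventional 2-CPU server

print(f"AI server (GPUs only): {ai_server_kw:.1f} kW")
print(f"Ratio vs. conventional server: {ai_server_kw / conventional_kw:.1f}x")
```

The 5.6 kW from the GPUs alone, before CPUs, memory and fans are counted, is what brings the total to "almost six kilowatts" per server.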

Technological levers

The first step is to understand that sustainability is an ongoing process. There is no single measure that will suddenly make AI sustainable. However, small steps can make a big difference. Manufacturers know they need to develop more powerful products that at the same time use fewer resources. This demand comes from consumers and investors and, increasingly, from governments. In the future, energy efficiency will be not only an ethical but also a legal requirement for AI organizations. Recent provisions of the EU AI Act require operators to adopt state-of-the-art methods to reduce the energy consumption of their AI platforms while improving their performance.

Technically, this can be achieved at three levels: first, by optimizing the chips that provide the computing power; second, by developing energy-efficient computers tailored to these chips; and third, by improving efficiency in data centers. Sustainability is increasingly becoming a competitive advantage for chip and computer manufacturers, especially as companies strive to meet their ESG goals. Current research suggests that in the coming decades, new developments such as analog chips could provide an energy-efficient alternative particularly well suited to neural networks.

In data centers, older air-cooling technologies are already reaching their limits in the face of the high energy demands of artificial intelligence. Customers are therefore increasingly turning to liquid cooling to minimize energy consumption. These include the Leibniz Supercomputing Centre (LRZ), which has been using water cooling for over ten years. Other institutions, such as the Karlsruhe Institute of Technology (KIT) and the Zuse Institute Berlin (ZIB), also use water-cooled GPUs. In addition, the new supercomputer at the Potsdam Institute for Climate Impact Research (PIK) will receive hot-water-cooled AI systems. By efficiently transferring the heat produced by AI systems to water, 30 to 40 percent of the electricity can be saved, while the better cooling simultaneously increases computing power.
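One way to see where savings of that magnitude come from is through the data center's power usage effectiveness (PUE), the ratio of total facility power to IT power. The PUE values below are illustrative assumptions for this sketch, not figures from the text:

```python
# Illustrative estimate of how lower cooling overhead reduces total
# facility power. PUE = total facility power / IT equipment power.
# The PUE values are assumed for illustration, not measured.
it_load_kw = 1000.0    # hypothetical IT load of a data center
pue_air = 1.6          # assumed PUE with conventional air cooling
pue_water = 1.1        # assumed PUE with direct hot-water cooling

total_air = it_load_kw * pue_air
total_water = it_load_kw * pue_water
savings_pct = (total_air - total_water) / total_air * 100

print(f"Facility power, air cooling:   {total_air:.0f} kW")
print(f"Facility power, water cooling: {total_water:.0f} kW")
print(f"Savings: {savings_pct:.0f} percent")
```

Under these assumed PUE values the savings land at roughly 31 percent, in the 30 to 40 percent range the article cites.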

A growing number of customers are showing interest in such systems, a trend driven by sustainability and cost objectives. This interest comes from both cloud service providers and data center colocation providers. The increased demand reflects a clear move toward more environmentally friendly and cost-effective cooling methods in the industry.

The use of data centers powered by renewable energy sources will also be critical to reducing AI’s CO2 footprint. As-a-service approaches to AI technology can also help minimize resource waste through sharing and ensure that organizations use the latest, most sustainable hardware without having to make direct capital expenditures.

With artificial intelligence toward sustainability

Many use AI for the benefit of society, for example to improve medical research or to study climate change more accurately. Others, however, use it purely for entertainment. This raises the question of whether these different energy demands should be treated differently.

There is no doubt that AI has significant potential to bring about positive change and is already having an impact in many sectors. There are many examples of how artificial intelligence has the potential to mitigate the effects of climate change. The UN emphasizes that artificial intelligence not only helps to better predict and understand extreme weather events, but can also provide immediate support to affected communities.

In addition, AI can provide new insights into our environment, which in turn could help reduce greenhouse gas emissions. In smart cities, emissions could be reduced by using artificial intelligence to shorten the running time of heating and air-conditioning systems. The technology tracks residents’ habits and can turn down the heating or air conditioning before they leave their homes. It can also control traffic in a city to enable efficient driving and avoid traffic jams. A Norwegian start-up relies on artificial intelligence to explore the ocean depths: it predicts the movement of currents to contain pollution and helps ships reduce their fuel consumption.

There is no doubt that artificial intelligence requires significant amounts of energy. Therefore, this problem must be gradually addressed by relying on hot water cooling rather than air cooling, using renewable energy sources to power data centers, and encouraging innovation in chip and computer technology.

Andreas Thomasch, Director HPC & AI DACH, France, UKI at Lenovo
