
Energy-Efficient AI

AI – But Without the Electricity Bill

The rapid integration of artificial intelligence into everyday life has a downside: its enormous energy consumption. Data centres already account for a few per cent of global electricity consumption. At the L3S Research Center in Hanover, scientists are therefore exploring ways to make AI systems more energy efficient.

“Resource-efficient innovations in the core elements of today’s AI systems can have a huge impact on global energy consumption,” says Prof. Dr.-Ing. Bodo Rosenhahn, member of the L3S Directorate and Head of the Institute for Information Processing (TNT). Together with researchers from the Institute for Microelectronic Systems (IMS), his team is investigating how to curb AI’s energy appetite – in projects such as GreenAutoML4FAS, PhoenixD, and QuantumFrontiers. TNT focuses on the fundamentals of algorithms and training methods, while IMS develops energy-efficient computing architectures that run AI with minimal power requirements.

Quantum and Photons: Rethinking AI

The usual rule is: the more complex the AI, the higher the energy demand. But the Quantum Machine Learning chair at TNT is pursuing a different approach. Quantum computers are expected to handle particularly computation-heavy AI tasks – and, in turn, be improved by AI themselves: a cycle that could save energy in the long term.

Research goes even further in the field of photonic components. Using classical AI methods, scientists calculate three-dimensional photonic structures, which are then manufactured via 3D printing. These components perform mathematical functions using only light – without any electrical energy. In theory, this could one day allow AI to run entirely on photons.

Computing Architectures for Autonomous Vehicles

Hardware development also focuses on energy efficiency. At L3S, IMS researchers are creating novel, heterogeneous hardware architectures that combine high-performance computing with low energy consumption. One example is the ZuSE-KI-Mobil platform, developed in collaboration with industry and research partners.

It integrates conventional processors (CPUs), AI accelerators, and other specialised components. Key factors include efficient “mapping” of neural networks – optimally distributing computing tasks – and smart scheduling of algorithms. This ensures maximum utilisation of the platform while minimising energy demand.
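The mapping idea can be pictured with a small sketch: each layer of a neural network is assigned to whichever compute unit runs it with the least energy. The unit names, layer types, and energy figures below are invented for illustration – the actual ZuSE-KI-Mobil scheduler is far more sophisticated and also balances utilisation and timing.

```python
# Toy sketch of energy-aware "mapping": greedily assign each network
# layer to the compute unit with the lowest energy cost for that layer.
# All numbers are made-up illustrative values, not platform data.

ENERGY = {  # energy cost per layer type, in arbitrary units
    "cpu":         {"conv": 9.0, "dense": 4.0, "pool": 1.0},
    "accelerator": {"conv": 2.0, "dense": 1.5, "pool": 1.2},
}

def map_layers(layers):
    """Return (layer, unit) assignments and the total energy cost."""
    mapping = []
    total = 0.0
    for layer in layers:
        # Pick the unit that executes this layer most cheaply.
        unit = min(ENERGY, key=lambda u: ENERGY[u][layer])
        mapping.append((layer, unit))
        total += ENERGY[unit][layer]
    return mapping, total

mapping, total = map_layers(["conv", "pool", "dense", "conv"])
for layer, unit in mapping:
    print(f"{layer} -> {unit}")
print(f"total energy: {total}")
```

In this toy model, convolution and dense layers land on the accelerator while the cheap pooling step stays on the CPU – the same intuition behind distributing work across heterogeneous hardware, though real schedulers must also account for data-transfer costs and parallelism.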

Energy Efficiency as a Competitive Advantage

“The future AI market will not be determined solely by the most powerful models. Energy efficiency will also become a decisive factor for industry and policy,” says Rosenhahn. Through their research, L3S scientists aim to help Germany remain not only technologically advanced but also ecologically and economically attractive.

Contact

Timo Kaiser, M.Sc.

Research Associate at the Institute for Information Processing, specialising in multiple-object tracking and uncertainty in machine learning.

Research Associate at the Institute for Microelectronic Systems, focusing on performance modelling and optimisation of AI algorithms for AI hardware.

Prof. Dr.-Ing. Bodo Rosenhahn

Director at L3S and Head of the Institute for Information Processing. His research covers computer vision, machine learning, and big data.