AI everywhere, but without sky-high energy usage
Making large AI models available on devices that operate in isolated environments, such as agricultural areas, forests, or oceans. That, in a nutshell, is the mission of Dolly Sapra. Since late March, she has been an assistant professor in the Parallel Computing Systems (PCS) group of the Informatics Institute at LAB42, focusing on Edge AI. To make that possible, energy efficiency is key.
AI offers unprecedented possibilities, but an AI model (such as ChatGPT) will not work on a small device without access to the internet. Yet areas with poor coverage could also benefit from autonomous intelligent systems. Think of forest fire prevention, poacher detection or crop monitoring, with the system running on a small device or a drone. In healthcare, too, it may be undesirable to send privacy-sensitive data to a server. In such situations, Edge AI can be deployed.
Working at the edge
This type of AI runs at the edges of a network, far away from the servers it would normally rely on. Making AI work on such devices requires considerable modifications. Current AI models are very large and perform an enormous number of calculations, so a battery-powered device would quickly run out of power. By adapting the software, AI models should be able to do their work at the edge as well.
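One common way of adapting the software is model compression, for instance quantization. The sketch below is purely illustrative and is not a description of Sapra's own methods: it uses PyTorch's post-training dynamic quantization to store a toy model's weights as 8-bit integers instead of 32-bit floats, which shrinks the model and reduces the cost of each prediction.

```python
import torch
import torch.nn as nn

# A small example network standing in for a much larger model.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Post-training dynamic quantization: the Linear layers' weights are
# stored as 8-bit integers, reducing memory footprint and the energy
# cost of each inference on a small device.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])
```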
Personal mission
In addition, Sapra has a personal mission: making AI models more energy-efficient, not only for edge applications but also for larger systems connected to servers in data centres. One of her new lines of research will therefore be to create hybrid AI models that combine current, energy-intensive techniques with older, more energy-efficient ones.
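The article does not spell out how such hybrid models would be built. One possible reading is a cascade, in which a cheap classical rule handles the easy inputs and the energy-hungry neural network is only woken up for the hard ones. The sketch below is an assumption-laden illustration of that general idea; the names `big_model`, `cheap_rule` and the thresholds are invented for this example.

```python
import torch
import torch.nn as nn

# Hypothetical hybrid cascade: a nearly free classical heuristic screens
# inputs, and the expensive neural network is only invoked when the
# heuristic is not confident. All names and thresholds are illustrative.
big_model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))

def cheap_rule(x: torch.Tensor) -> tuple[int, float]:
    """Classical, low-energy heuristic: score by mean signal magnitude."""
    score = x.abs().mean().item()
    label = 1 if score > 0.8 else 0
    confidence = abs(score - 0.8)  # distance from the decision boundary
    return label, confidence

def hybrid_predict(x: torch.Tensor, threshold: float = 0.2) -> int:
    label, confidence = cheap_rule(x)
    if confidence >= threshold:
        return label                      # easy case: no deep model needed
    with torch.no_grad():                 # hard case: spend the energy
        return int(big_model(x.unsqueeze(0)).argmax())

print(hybrid_predict(torch.randn(16)))
```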