CSE researchers receive Mozilla funding for research on AI energy use

The researchers were selected as recipients of the 2024 Mozilla Technology Fund for Zeus, an effort to measure and optimize the energy consumption of machine learning.

Researchers in CSE have been selected for the 2024 cohort of the Mozilla Technology Fund, which aims to support the development of innovative technological solutions to pressing global problems. This year, Mozilla tasked researchers with developing open-source AI tools that address issues related to the environment, climate change, and energy.

According to Mehan Jayasuriya, Senior Program Officer at Mozilla, “These awardees are leading the way in demonstrating how open source AI can be used to help, not harm, the environment.”

CSE PhD student Jae-Won Chung

The Zeus team, led by PhD student Jae-Won Chung with advisor Prof. Mosharaf Chowdhury, has answered this challenge by building tools that accurately measure and automatically reduce the energy consumption of machine learning systems.

The tech used to power modern artificial intelligence (AI) applications is an increasingly large contributor to global energy consumption. Huge amounts of computing power are needed to train and run large AI models, and more computation means more energy. According to the Department of Energy, data centers consume 10 to 50 times the energy per floor space of a typical commercial building, and that demand is only growing as large language models (LLMs) and other AI tools increase in prevalence and size.

As an open-source framework aiming to address machine learning’s growing environmental footprint, Zeus provides two key innovations: tools for easily and accurately measuring the energy consumption of deep learning workloads, and an array of methods for automatically optimizing energy efficiency based on those measurements.
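To give a sense of the measurement side, here is a minimal sketch based on the project's public documentation; the class and method names (ZeusMonitor, begin_window, end_window) reflect the library at the time of writing and may differ across versions.

```python
# Minimal sketch of measuring the GPU time and energy of a block of work
# with Zeus. Names follow the project's documentation and may change.
from zeus.monitor import ZeusMonitor

monitor = ZeusMonitor(gpu_indices=[0])  # measure GPU 0

monitor.begin_window("training_step")
# ... run one training step (or any workload to be measured) here ...
measurement = monitor.end_window("training_step")

print(f"Time: {measurement.time} s, Energy: {measurement.total_energy} J")
```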

To accomplish this, Zeus searches for settings in both the deep learning model and the hardware that reduce energy use without sacrificing speed. These knobs include the training batch size (the number of data samples the model processes in each training iteration), the GPU's power limit, and the GPU's core frequency over time.
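The sketch below illustrates the general idea behind one of these knobs, power-limit tuning; it is not Zeus's actual implementation. It sweeps the GPU's supported power limits, measures how long a fixed workload takes and how much energy it draws at each setting, and keeps the limit that minimizes a weighted time-energy cost. It uses NVIDIA's NVML bindings (pynvml); the run_workload helper and the eta weight are hypothetical placeholders, and changing a power limit typically requires administrator privileges.

```python
# Illustrative sketch of power-limit tuning: try each supported GPU power
# limit, measure a fixed workload's time and energy, and keep the setting
# with the lowest weighted time-energy cost. Not Zeus's actual code.
import pynvml

def best_power_limit(run_workload, eta=0.5, step_mw=25_000):
    """run_workload() runs a fixed workload and returns (time_s, energy_j)."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    # Supported power-limit range for this GPU, in milliwatts.
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

    best_limit, best_cost = None, float("inf")
    for limit_mw in range(min_mw, max_mw + 1, step_mw):
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, limit_mw)  # needs admin rights
        time_s, energy_j = run_workload()
        # Weighted cost trades energy against slowdown:
        # eta = 1 optimizes energy only, eta = 0 optimizes time only.
        cost = eta * energy_j + (1 - eta) * (max_mw / 1000) * time_s
        if cost < best_cost:
            best_limit, best_cost = limit_mw, cost

    pynvml.nvmlShutdown()
    return best_limit
```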

As a Mozilla Technology Fund awardee, the Zeus team will receive $50,000 in funding for their continued research and development. Going forward, their research will focus on making these energy measurement and optimization tools more robust and expanding support for various deep learning software frameworks and hardware accelerators.

The code for Zeus is available on GitHub. More information about Zeus can also be found in this 2023 story, Optimization could cut the carbon footprint of AI training by up to 75%.