This internship proposal aims to study the energy footprint of machine learning (ML) algorithms and to identify relevant levers that could drastically reduce the contribution of ML to the global carbon footprint of ICT.
Keywords: machine learning, energy consumption
If you are interested, please contact the authors by email.
While most studies on sustainable AI focus on how machine learning (ML) techniques could reduce energy consumption, only a few look at the energy required to tune and use ML models themselves. In a recent paper, the authors estimated that training a large deep learning pipeline can emit as much CO2 as five cars over their entire lifetimes. The conclusion of this paper raises three recommendations, which we supplement with remarks and research directions.
Authors should report training time and sensitivity to hyperparameters. As the authors explain, realizing this will require, among other things, a standard, hardware-independent measurement of training time, such as the number of gigaflops required to reach convergence. Measuring energy consumption at a fine grain is difficult, but recent evolutions in hardware and GPU chips provide novel functionalities in that direction.
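As an illustration of such a hardware-independent metric, the sketch below estimates training cost as gigaflops to convergence for a small stack of dense layers. The layer sizes, the step count, and the usual 2·m·n multiply-accumulate rule (with backward pass approximated as twice the forward cost) are illustrative assumptions, not a prescribed standard.

```python
# Hypothetical sketch: training cost as gigaflops to convergence,
# a hardware-independent proxy for training time.

def dense_layer_flops(in_dim, out_dim, batch_size):
    """FLOPs for one forward+backward pass of a dense layer.

    Forward pass: 2 * in_dim * out_dim multiply-adds per sample;
    the backward pass is commonly approximated as twice the forward cost.
    """
    forward = 2 * in_dim * out_dim * batch_size
    return forward * 3  # forward + ~2x backward

def gigaflops_to_convergence(layer_dims, batch_size, steps):
    """Total gigaflops for `steps` training steps over a stack of dense layers."""
    per_step = sum(
        dense_layer_flops(m, n, batch_size)
        for m, n in zip(layer_dims, layer_dims[1:])
    )
    return per_step * steps / 1e9

# Example: a small 784-256-10 MLP trained for 10,000 steps at batch size 32.
cost = gigaflops_to_convergence([784, 256, 10], batch_size=32, steps=10_000)
print(f"{cost:.1f} GFLOPs to convergence")
```

Reporting such a number alongside accuracy would let readers compare the training cost of two pipelines independently of the hardware each was run on.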
Academic researchers need equitable access to computation resources. This requirement is difficult to enforce worldwide. An alternative is to develop budgeted-learning scenarios in which the budget is directly correlated with energy consumption.
Researchers should prioritize computationally efficient hardware and algorithms. One way to address this issue is to put the measurement of energy consumption directly in the learning loop. The design of an energy-based constraint in the standard objective functions of learning algorithms can therefore be investigated.
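One simple form such an energy-based constraint could take is a penalty term added to the task loss, so that the optimizer trades accuracy against estimated consumption. The sketch below is a minimal illustration under strong assumptions: the energy model (cost proportional to model width) and the penalty weight `lam` are hypothetical, not part of any published method.

```python
# Hypothetical sketch of an energy-aware objective: the task loss is
# augmented with a penalty proportional to an estimated energy cost,
# one possible way to put energy consumption in the learning loop.

def energy_penalized_loss(task_loss, model_width, lam=1e-4, joules_per_unit=0.02):
    """Combine a task loss with an energy-based regularization term."""
    energy_estimate = model_width * joules_per_unit  # crude proxy for consumption
    return task_loss + lam * energy_estimate

# Comparing two candidate widths under the same task loss: the wider model
# pays a higher energy penalty, steering the search toward frugal models.
small = energy_penalized_loss(task_loss=0.30, model_width=256)
large = energy_penalized_loss(task_loss=0.30, model_width=4096)
assert small < large
```

The design question the internship could explore is how to choose the energy model and the weight `lam` so that the penalty reflects real consumption rather than a structural proxy such as width.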
In this internship proposal, we suggest the following line of work: