We held our next tinyML Talks webcast. Thomas Elsken from the Bosch Center for Artificial Intelligence presented Efficient Multi-Objective Neural Architecture Search with Evolutionary Algorithms on March 10, 2021.
Deep Learning has enabled remarkable progress in recent years on a variety of tasks such as image recognition. One crucial aspect of this progress is novel neural architectures. Currently employed architectures have mostly been developed manually by human experts, which is a time-consuming and error-prone process. This has led to growing interest in neural architecture search (NAS), the process of automatically finding neural network architectures for a task at hand. While recent approaches have achieved state-of-the-art predictive performance, they are problematic under resource constraints for two reasons: (1) the neural architectures found are typically optimized solely for high predictive performance, without penalizing excessive resource consumption; (2) most architecture search methods require vast computational resources, on the order of hundreds of thousands of GPU days. After giving a short introduction to NAS, we address these problems by proposing LEMONADE, an evolutionary algorithm for multi-objective architecture search that approximates the entire Pareto front of architectures under multiple objectives, such as predictive performance and resource consumption. LEMONADE employs an inheritance mechanism for neural architectures to generate child networks that are warm-started with the predictive performance of their trained parents, overcoming the need for immense computational resources.
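To illustrate the multi-objective idea behind the talk, here is a minimal sketch (not LEMONADE itself) of extracting the Pareto front from a population of candidate architectures, each scored on two objectives where lower is better: validation error and parameter count. The candidate names and numbers are purely illustrative assumptions.

```python
def pareto_front(population):
    """Return the non-dominated subset of (name, error, params) tuples.

    A candidate is dominated if some other candidate is at least as good
    on both objectives and strictly better on at least one.
    """
    front = []
    for cand in population:
        _, err, params = cand
        dominated = any(
            (o_err <= err and o_params <= params)
            and (o_err < err or o_params < params)
            for _, o_err, o_params in population
        )
        if not dominated:
            front.append(cand)
    return front

# Hypothetical candidates: (name, validation error, #params in millions)
candidates = [
    ("net-a", 0.08, 25.0),
    ("net-b", 0.10, 5.0),
    ("net-c", 0.09, 30.0),  # dominated by net-a (worse on both objectives)
    ("net-d", 0.15, 1.2),
]

print(pareto_front(candidates))
# → [('net-a', 0.08, 25.0), ('net-b', 0.1, 5.0), ('net-d', 0.15, 1.2)]
```

An evolutionary multi-objective search such as the one presented keeps a population like this, generates children from parents on the front, and repeatedly discards dominated candidates, so the surviving set traces out the trade-off curve between accuracy and resource consumption.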
Thomas Elsken is a Research Engineer at the Bosch Center for Artificial Intelligence. Thomas received his Master’s degree in Mathematics at the University of Muenster and did his PhD on neural architecture search at the University of Freiburg and the Bosch Center for Artificial Intelligence under the supervision of Prof. Frank Hutter. Thomas’ research interests lie in automated machine learning. His work focuses on automatically designing neural network architectures, also known as neural architecture search. More recently, Thomas has become interested in meta-learning and how meta-learning algorithms can be combined with neural architecture search methods to make them more efficient and practical.
Watch on YouTube:
Download presentation slides:
Feel free to ask your questions on this thread and keep the conversation going!