tinyML Talks on November 4, 2020 “Introduction to optimization algorithms for compressing neural networks” by Marcus Rüb

We held our next tinyML Talks webcast on November 4, 2020: Marcus Rüb from the Hahn-Schickard Research Institute presented "Introduction to optimization algorithms for compressing neural networks."


Deep neural networks enable state-of-the-art accuracy in visual recognition tasks such as image classification and object recognition. However, modern networks contain millions of learned connections, and the current trend is toward deeper and more densely connected architectures. This poses a challenge for deploying advanced networks on resource-constrained systems such as smartphones or other mobile and embedded devices. To make neural networks usable on embedded devices, various techniques exist to compress the models.
This talk presents the most common compression algorithms and explains how they work, including pruning, quantization, and others.
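To give a flavor of the techniques the talk covers, here is a minimal NumPy sketch of two of them: magnitude-based pruning (zeroing out the weights with the smallest absolute values) and simple symmetric 8-bit linear quantization. The function names and the specific schemes are illustrative assumptions, not the speaker's exact algorithms.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest |w|.

    Illustrative magnitude pruning: small weights contribute little to
    the output, so setting them to zero often costs little accuracy.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(weights) <= threshold] = 0.0
    return pruned

def quantize_int8(weights):
    """Symmetric linear quantization of float weights to int8.

    Maps the range [-max|w|, max|w|] onto [-127, 127]; dequantize
    with q * scale. Real toolchains use more refined schemes.
    """
    scale = np.max(np.abs(weights)) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

w = np.array([0.1, -0.5, 0.02, 0.9, -0.03])
print(prune_by_magnitude(w, 0.4))  # the two smallest-magnitude weights are zeroed
q, scale = quantize_int8(w)
print(q, scale)
```

Both techniques shrink the memory footprint of a model: pruning creates sparsity that sparse storage formats can exploit, while int8 quantization cuts per-weight storage from 32 bits to 8.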

Marcus Rüb studied electrical engineering at Furtwangen University. After completing his bachelor's degree, he worked as a scientific assistant for AI at Hahn-Schickard while completing his master's degree. His main interest is embedded AI, which often involves implementing machine learning algorithms on embedded devices and compressing ML models. Furthermore, Marcus is one of the federally funded AI trainers and supports companies in integrating AI into their processes.


Watch on YouTube:
Marcus Rüb

Download presentation slides:
Marcus Rüb

Feel free to ask your questions on this thread and keep the conversation going!