tinyML Talks on February 27, 2023 “From the lab to the edge: Post-Training Compression” by Edouard Yvinec from Datakalab

We held our next tinyML Talks webcast. Edouard Yvinec from Datakalab presented “From the lab to the edge: Post-Training Compression” on February 27, 2023.


Deep neural networks (DNNs) are nowadays ubiquitous in many domains such as computer vision. However, going from TensorFlow or PyTorch to an efficient DNN deployment on the edge remains one of the industry’s biggest challenges. During this presentation, we will see how Datakalab solves this problem in two steps, without intensive computation or re-training on the cloud. First, we remain agnostic of the training framework by providing support for the inference of any DNN on a wide range of hardware. Second, we designed custom, state-of-the-art compression techniques that rely on post-training quantization, pruning, and context adaptation. The resulting inference models achieve remarkable speeds on microchips while keeping the accuracy loss below 1%.
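For readers new to these ideas, here is a minimal sketch (plain NumPy, not Datakalab’s actual tooling) of two of the ingredients named in the abstract: uniform post-training weight quantization and magnitude pruning. Both operate directly on a trained weight tensor with no gradient updates, which is what makes them “post-training”. All names and parameters here (the per-tensor min-max scale, the 50% sparsity target) are illustrative assumptions, not details from the talk.

```python
import numpy as np

def quantize_weights(w: np.ndarray, num_bits: int = 8):
    """Uniform symmetric post-training quantization of a weight tensor.

    Returns the integer representation and the scale needed to
    dequantize (w ≈ w_q * scale). No re-training is involved.
    """
    qmax = 2 ** (num_bits - 1) - 1           # e.g. 127 for int8
    scale = np.abs(w).max() / qmax           # per-tensor scale from the weight range
    w_q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return w_q, scale

def prune_weights(w: np.ndarray, sparsity: float = 0.5):
    """Magnitude pruning: zero out the smallest-magnitude entries."""
    threshold = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) < threshold, 0.0, w)

# Toy usage on a random "pre-trained" layer.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(64, 64)).astype(np.float32)

w_pruned = prune_weights(w, sparsity=0.5)
w_q, scale = quantize_weights(w_pruned, num_bits=8)
w_dequant = w_q.astype(np.float32) * scale

print("max reconstruction error:", np.abs(w_pruned - w_dequant).max())
```

In practice, methods like those presented in the talk refine these basics (e.g., better scale selection and structured sparsity) to keep the accuracy drop under 1%, but the compute-light, data-light character shown above is the core idea.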

Edouard Yvinec is a PhD student at Datakalab and Sorbonne Université. His main research interest is neural network compression, which he applies to computer vision and NLP tasks. He has published several works on post-training compression: at NeurIPS [1,2] on pruning, at ICLR [3] and WACV [4] on quantization, and at IJCAI [5] on layer folding. Each of these methods focuses on computer vision tasks solved by ConvNets, with the exception of PowerQuant [3], which also tackles transformer compression for both vision and NLP.

=========================

Watch on YouTube:
Edouard Yvinec

Download presentation slides:
Edouard Yvinec

Feel free to ask your questions on this thread and keep the conversation going!