Two tinyML Talks on June 16, 2020: 1) “Low-Power Computer Vision” by Yung-Hsiang Lu (Purdue University); 2) “Saving 95% of your edge power with Sparsity to enable tiny ML” by Jon Tapson (GrAI Matter Labs)

We held our eighth tinyML Talks webcast with two presentations on June 16, 2020:
Yung-Hsiang Lu from Purdue University presented “Low-Power Computer Vision” at 8:00 AM Pacific Time, and Jon Tapson from GrAI Matter Labs presented “Saving 95% of your edge power with Sparsity to enable tiny ML” at 8:30 AM Pacific Time.


Jon Tapson (left) and Yung-Hsiang Lu (right)

Computer vision has been widely adopted. Many applications require that vision solutions run on battery-powered systems, such as mobile phones, autonomous robots, and drones. This presentation will survey the existing technologies for making computer vision energy-efficient, including (1) parameter quantization and pruning, (2) compressed convolutional filters and matrix factorization, (3) network architecture search, and (4) knowledge distillation. The speaker will explain how to use hierarchical neural networks to reduce energy consumption on embedded systems. Finally, the talk will introduce the IEEE International Low-Power Computer Vision Challenge (the 2020 competition is open July 1-31; please visit https://lpcv.ai/).
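As a quick illustration of two of the techniques listed above (a generic sketch, not material from the talk), the snippet below uses PyTorch to apply magnitude pruning and dynamic int8 quantization to a made-up toy model; the layer sizes and the 50% pruning ratio are arbitrary.

```python
# Minimal sketch (not from the talk): magnitude pruning + dynamic quantization.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical toy model; sizes are arbitrary.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# (1a) Pruning: zero out the 50% smallest-magnitude weights in each Linear layer.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the zeroed weights permanent

# (1b) Quantization: store Linear weights as int8 for smaller, cheaper inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])
```

Note that zeroed weights only turn into real savings when the runtime or hardware can skip them, whereas quantization directly shrinks weight storage and uses int8 arithmetic.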

Yung-Hsiang Lu is a professor at the School of Electrical and Computer Engineering (and by courtesy at the Department of Computer Science) of Purdue University, West Lafayette, Indiana, USA. Dr. Lu’s research topics include energy-efficient computing, computer vision, and mobile and cloud computing. He is an ACM Distinguished Scientist (2013) and ACM Distinguished Speaker (2013-2016). He is the lead organizer of the IEEE International Low-Power Computer Vision Challenge. Dr. Lu is the inaugural director of Purdue’s John Martinson Entrepreneurship Center. Students from his research team started two technology companies using computer vision for retail stores and raised more than $1.3M in SBIR (Small Business Innovation Research) grants. In 2019, he received the Outstanding Entrepreneurship Award from the VIP (Vertically Integrated Projects) Consortium. Dr. Lu obtained his PhD from the Department of Electrical Engineering at Stanford University, California, USA.

The kinds of tasks for which ML is used at the edge differ from those for ML in the data center. Specifically, they tend to be continuous real-time processes with streaming data, on which inference must be performed in each sampling interval. In this talk, we will describe how this type of process offers significant opportunities to reduce the computation needed, which can be exploited as the very low latency and/or very low power required to enable tiny machine learning tasks. If we make a conscious effort to exploit multiple types of sparsity, we can drive significant advances in edge processing. We will explain these types of sparsity (time, space, connectivity, activation) in terms of edge processes, and how they affect computation on a practical level. We will introduce the new GrAI Core architecture and explain how it uses an event-based paradigm to maximally exploit sparsity, saving energy or improving latency in edge inference workloads relevant to tiny machine learning applications. The results will be illustrated with examples of real-world applications in which GrAI Core chips are being used.
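The sketch below (plain NumPy, not GrAI Matter Labs code; the threshold and frame sizes are invented) illustrates the temporal/activation-sparsity idea from the abstract: between consecutive frames of a stream, only pixels whose change exceeds a threshold generate work, so computation scales with what changed rather than with the full frame.

```python
# Minimal NumPy sketch (not GrAI Core): temporal/activation sparsity means
# downstream work is driven only by pixels that changed between frames.
import numpy as np

THRESHOLD = 0.05  # hypothetical change threshold

def sparse_update(prev_frame: np.ndarray, new_frame: np.ndarray):
    """Return the indices and deltas of pixels that changed significantly."""
    delta = new_frame - prev_frame
    events = np.abs(delta) > THRESHOLD          # temporal-sparsity mask
    return np.argwhere(events), delta[events]

rng = np.random.default_rng(0)
frame0 = rng.random((240, 320))
frame1 = frame0.copy()
frame1[100:110, 50:60] += 0.2                   # a small region "moves"

indices, deltas = sparse_update(frame0, frame1)
active_fraction = indices.shape[0] / frame0.size
print(f"Active pixels: {active_fraction:.2%}")  # ~0.13% of the frame needs work
```

In this toy example only the 10x10 patch that changed produces events, so roughly 0.1% of the frame has to be processed in that sampling interval.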

Jonathan Tapson is the Chief Scientific Officer of GrAI Matter Labs. Prior to this, he was the Director of the MARCS Institute for Brain, Behaviour and Development at the University of Western Sydney, and has held positions at Dean and Head of Department levels in multiple universities. His research covers neuromorphic engineering and bio-inspired sensors, and he has authored over 160 papers and a dozen patents.

==========================

Watch on YouTube:
Yung-Hsiang Lu
Jon Tapson

Download presentation slides:
Yung-Hsiang Lu
Jon Tapson

Feel free to ask your questions on this thread and keep the conversation going!