We held our next tinyML Talks webcast. Andrii Polukhin from Data Science UA presented Lightweight Neural Network Architectures on January 16, 2023.
Have you ever wondered what makes neural networks like MobileNetV3, FBNet, and BlazeFace so special? These networks can be found in everyday devices such as our phones and TVs.
Their defining challenge, and the focus of this talk, is the efficient design of neural networks for low-power devices.
The talk covers the following topics:
- the rationale behind the design of blocks such as the Fire module and Squeeze-and-Excitation (see the sketch after this list);
- methods for determining the number of model parameters and the width and depth of architecture layers, for example the EfficientNet or Model Rubik’s Cube algorithms;
- state-of-the-art (SOTA) solutions for accelerating and optimizing certain layers of your neural network.
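For readers who have not met it before, the Squeeze-and-Excitation idea mentioned above is easy to sketch in code. Below is a minimal, illustrative PyTorch sketch (our own example, not material from the talk); the class name and the reduction ratio of 4 are assumptions chosen for illustration:

```python
import torch
import torch.nn as nn


class SqueezeExcitation(nn.Module):
    """Squeeze-and-Excitation: global-average-pool the feature map ("squeeze"),
    pass it through a small bottleneck MLP ("excitation"), and use the result
    to rescale each channel. Reduction ratio of 4 is an illustrative choice."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: (B, C, H, W) -> (B, C, 1, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # bottleneck
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # per-channel gates in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        gates = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * gates  # excite: reweight the channels


# Usage: rescale a 32-channel feature map
se = SqueezeExcitation(channels=32)
x = torch.randn(1, 32, 56, 56)
print(se(x).shape)  # torch.Size([1, 32, 56, 56])
```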
This session will interest anyone who wants to understand how to build a lightweight, effective neural network.
Andrii’s main interests are research on deep learning architectures, methods for training deep networks, the intuition and mathematical theorems behind them, and the interaction of AI with the surrounding world.
=========================
Watch on YouTube: Andrii Polukhin
Download presentation slides: Andrii Polukhin
Feel free to ask your questions on this thread and keep the conversation going!