We held our next tinyML Talks webcast. Swarnava Dey from TCS Research presented Neural Architecture Search for Tiny Devices on April 10, 2023.
It is widely anticipated that inference models based on Deep Neural Networks (DNNs) will be deployed on many edge platforms, which has spurred research into the automated discovery of tiny neural architectures through search. Since Neural Architecture Search (NAS) was proposed in 2016, much of the research has focused on quickly finding DNN architectures that surpass the performance of human-designed ones. Beyond improving the search process itself, NAS is widely used to generate and customize DNN models for a given target hardware. This has become especially important for embedded DNNs, which must meet platform-specific constraints and multiple objectives such as low latency, small memory footprint, and low power consumption. NAS can deliver customized models for a target platform that are both efficient and accurate. However, existing frameworks offer either (a) fast generation of accurate models or (b) slow generation of models that are both accurate and efficient, but not both at once. To address this, the tutorial explains the basic NAS process and the mathematical model behind the search, making it easier for TinyML engineers to tweak existing NAS frameworks in an informed manner.
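To make the hardware-aware search idea concrete, here is a minimal sketch of constrained NAS via random search. Everything in it is an illustrative assumption, not code from the talk or any real NAS framework: the toy search space, the `proxy_accuracy` stand-in for trained accuracy, and the parameter-count proxy standing in for a real hardware cost model (latency, memory, or power).

```python
import random

# Hypothetical toy search space: each candidate architecture is a choice of
# depth, width, and kernel size. A real NAS space would be far richer.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "channels": [8, 16, 32],
    "kernel": [3, 5],
}

def sample_architecture(rng):
    """Draw one candidate uniformly at random from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_accuracy(arch):
    """Stand-in for trained accuracy: here, bigger models simply score higher.
    In practice this would come from training or a learned predictor."""
    return 0.5 + 0.04 * arch["num_layers"] + 0.003 * arch["channels"]

def estimate_params(arch):
    """Crude parameter-count proxy used as the platform cost model."""
    return arch["num_layers"] * arch["channels"] * arch["kernel"] ** 2

def search(budget=200, max_params=2000, seed=0):
    """Random search: maximize accuracy subject to a hardware size budget."""
    rng = random.Random(seed)
    best, best_acc = None, -1.0
    for _ in range(budget):
        arch = sample_architecture(rng)
        if estimate_params(arch) > max_params:
            continue  # reject candidates violating the platform constraint
        acc = proxy_accuracy(arch)
        if acc > best_acc:
            best, best_acc = arch, acc
    return best, best_acc

if __name__ == "__main__":
    arch, acc = search()
    print(arch, round(acc, 3))
```

Swapping the random sampler for a reinforcement-learning controller or a differentiable relaxation, and the proxies for trained accuracy and measured on-device latency, recovers the structure of the practical NAS frameworks the talk discusses.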
Swarnava Dey is a Senior Scientist at TCS Research, working on embedded vision systems. He holds an M.Tech from IIT Kharagpur, where he is currently pursuing a PhD on the robustness, verifiability, and explainability of embedded Deep Neural Networks and Neuro-Symbolic AI. He has 30+ granted patents and 25+ research papers, and writes for Towards Data Science: Swarnava Dey – Medium. His publications are listed on his Google Scholar page: Swarnava Dey - Google Scholar
Watch on YouTube:
Download presentation slides:
Feel free to ask your questions on this thread and keep the conversation going!