tinyML Talks on April 20, 2023 “Low Precision Inference and Training for Deep Neural Networks” by Philip Leong from the University of Sydney

We held our next tinyML Talks webcast. Philip Leong from the University of Sydney presented “Low Precision Inference and Training for Deep Neural Networks” on April 20, 2023.


In this talk we present Block Minifloat (BM) arithmetic, a parameterised minifloat format optimised for low-precision deep learning applications. While standard floating-point representations have two degrees of freedom, the exponent and mantissa, BM exposes an additional shared exponent bias that allows the range of each block of values to be controlled. Results will be presented for inference, training, and transfer learning at 4-8 bit precision that achieve accuracy similar to floating point.
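To make the idea concrete, below is a minimal NumPy sketch of block-wise minifloat quantisation as described in the abstract: values are grouped into blocks, each block derives a shared exponent bias from its largest magnitude, and individual values keep only a small private exponent and mantissa. The function name, parameters (exp_bits, man_bits, block_size), and rounding details are illustrative assumptions, not the speaker's actual implementation.

```python
# Hedged sketch of block-minifloat quantisation (assumed details, not the
# presenter's code): each block shares an exponent bias; each value keeps
# a small private exponent (exp_bits) and mantissa (man_bits).
import numpy as np

def block_minifloat_quantise(x, exp_bits=2, man_bits=1, block_size=16):
    """Quantise a 1-D array block-wise to an (exp_bits, man_bits) minifloat
    whose exponent is interpreted relative to a shared per-block bias."""
    x = np.asarray(x, dtype=np.float64)
    out = np.empty_like(x)
    max_exp = (1 << exp_bits) - 1              # largest private exponent code
    for start in range(0, x.size, block_size):
        block = x[start:start + block_size]
        block_max = np.max(np.abs(block))
        if block_max == 0:
            out[start:start + block_size] = 0.0
            continue
        # Shared bias: align the block's largest magnitude with the top
        # of the minifloat's representable exponent range.
        bias = int(np.floor(np.log2(block_max))) - max_exp
        for i, v in enumerate(block):
            if v == 0:
                out[start + i] = 0.0
                continue
            sign = np.sign(v)
            e = int(np.floor(np.log2(abs(v)))) - bias   # private exponent
            e = max(0, min(e, max_exp))                 # clamp to the format's range
            scale = 2.0 ** (e + bias)
            # Round the significand to man_bits fractional bits.
            mant = np.round(abs(v) / scale * (1 << man_bits)) / (1 << man_bits)
            out[start + i] = sign * mant * scale
    return out

# Example: quantise random activations to a 4-bit block minifloat
# (1 sign + 2 exponent + 1 mantissa bit, bias shared per block of 16).
acts = np.random.randn(64)
q = block_minifloat_quantise(acts, exp_bits=2, man_bits=1, block_size=16)
print("max abs quantisation error:", np.max(np.abs(acts - q)))
```

The key point the sketch illustrates is that the shared bias costs only a few bits per block, yet lets a 4-8 bit format track the dynamic range of activations, weights, and gradients block by block.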

Philip Leong received the B.Sc., B.E., and Ph.D. degrees from the University of Sydney. In 1993 he was a consultant to ST Microelectronics in Milan, Italy, working on advanced flash memory-based integrated circuit design. From 1997 to 2009 he was with the Chinese University of Hong Kong. He is currently Professor of Computer Systems in the School of Electrical and Information Engineering at the University of Sydney, Visiting Professor at Imperial College, and Chief Technology Officer at CruxML Pty Ltd.

=========================

Watch on YouTube:
Philip Leong

Download presentation slides:
Philip Leong

Feel free to ask your questions on this thread and keep the conversation going!
