tinyML Talks on May 13, 2021 “SRAM based In-Memory Computing for Energy-Efficient AI Inference” by Jae-sun Seo

We held our latest tinyML Talks webcast: Jae-sun Seo from Arizona State University presented “SRAM based In-Memory Computing for Energy-Efficient AI Inference” on May 13, 2021.


Artificial intelligence (AI) and deep learning have been successful across many practical applications, but state-of-the-art algorithms require an enormous amount of computation, memory, and on-/off-chip communication. To bring these expensive algorithms to low-power processors, a number of digital CMOS ASIC solutions have been proposed, but limitations remain in memory access and memory footprint.

To improve upon the conventional row-by-row operation of memories, “in-memory computing” designs have been proposed, which perform analog computation inside memory arrays by asserting multiple or all rows simultaneously. This talk will present recent silicon demonstrations of SRAM-based in-memory computing for AI systems. New memory bitcell circuits, peripheral circuits, architectures, and a modeling framework for design parameter optimization will be discussed.
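To make the multi-row idea concrete, here is a minimal, idealized Python sketch (not from the talk): it contrasts conventional row-by-row readout with a single multi-row analog accumulation. The array size, binary weights/inputs, and lossless ADC are all simplifying assumptions for illustration.

```python
import numpy as np

# Hypothetical toy model: a 64x8 SRAM array holding binary weights,
# with binary input activations driving the wordlines.
rng = np.random.default_rng(0)
weights = rng.integers(0, 2, size=(64, 8))   # one weight bit per cell
inputs = rng.integers(0, 2, size=64)         # one activation per row

def row_by_row(weights, inputs):
    """Conventional operation: read one row per cycle, accumulate digitally."""
    acc = np.zeros(weights.shape[1], dtype=int)
    for r in range(weights.shape[0]):        # one array access per row
        acc += inputs[r] * weights[r]
    return acc

def in_memory(weights, inputs):
    """In-memory computing: assert all wordlines whose input is 1 at once.
    Each bitline's analog current/charge sums the selected cells, and an
    ADC digitizes the column sum in a single array access."""
    return inputs @ weights                  # idealized column-wise summation

# Both produce the same 8 column dot products, but the in-memory version
# needs one array access instead of 64.
assert np.array_equal(row_by_row(weights, inputs), in_memory(weights, inputs))
print(in_memory(weights, inputs))
```

In silicon, the column sums are analog quantities, so ADC precision, bitcell variability, and peripheral circuit overheads determine how closely a real design approaches this ideal; those are exactly the bitcell and peripheral circuit considerations the talk addresses.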

Jae-sun Seo is an Associate Professor in the School of Electrical, Computer and Energy Engineering (ECEE) at Arizona State University. His research interests include efficient hardware design of machine learning and neuromorphic algorithms and integrated power management. He is a recipient of the IBM Outstanding Technical Achievement Award (2012), the NSF CAREER Award (2017), and the Intel Outstanding Researcher Award (2021).

==========================

Watch on YouTube:
Jae-sun Seo

Download presentation slides:
Jae-sun Seo

Feel free to ask your questions on this thread and keep the conversation going!