17th INTERNATIONAL FORUM ON MPSoC
for software-defined hardware
Speaker's Profile
Shinya Takamaeda
Hokkaido University, Japan
Energy-Efficient In-Memory Neural Network Processor
Abstract
Deep neural networks (DNNs) are now a fundamental machine learning technology across a wide range of applications. For edge computing, low-power DNN inference accelerators are required to bring sufficient intelligence into embedded devices. In this presentation, we introduce an energy-efficient deep neural network processor architecture and its LSI implementation. It combines ideas from binary neural networks and in-memory processing to improve effective memory bandwidth and reduce data-movement energy. We present the design of our in-memory DNN processor together with its evaluation results.
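The combination of binarization and in-memory processing works because a binary neural network constrains weights and activations to +1/-1, so a multiply-accumulate collapses into an XNOR followed by a popcount, a bitwise pattern that maps naturally onto processing inside the memory array. The following sketch illustrates only that generic XNOR-popcount reduction; it is not the presented processor's design, and the function names are illustrative.

```python
# Minimal sketch (not the presented design): a binarized dot product.
# With +1/-1 weights and activations packed one bit per element, the
# multiply-accumulate becomes XNOR + popcount -- the bitwise operation
# that in-memory (e.g., in-SRAM) processing architectures accelerate.

def pack_bits(values):
    """Pack a list of +1/-1 values into an integer bit vector (1 bit each)."""
    word = 0
    for i, v in enumerate(values):
        if v > 0:                # encode +1 as bit 1, -1 as bit 0
            word |= 1 << i
    return word

def binary_dot(w_bits, x_bits, n):
    """Dot product of two length-n +1/-1 vectors via XNOR + popcount."""
    xnor = ~(w_bits ^ x_bits) & ((1 << n) - 1)   # matching bit positions -> 1
    matches = bin(xnor).count("1")               # popcount
    return 2 * matches - n                       # matches minus mismatches

# The bitwise result equals the full-precision dot product of the +1/-1 vectors.
w = [+1, -1, +1, +1, -1, -1, +1, -1]
x = [+1, +1, -1, +1, -1, +1, +1, -1]
assert binary_dot(pack_bits(w), pack_bits(x), len(w)) == sum(a * b for a, b in zip(w, x))
```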
Biography
Shinya Takamaeda received the B.E., M.E., and D.E. degrees from Tokyo Institute of Technology, Japan, in 2009, 2011, and 2014, respectively. From 2011 to 2014, he was a JSPS research fellow (DC1). From 2014 to 2016, he was an assistant professor at Nara Institute of Science and Technology, Japan. Since 2016, he has been an associate professor at Hokkaido University, Japan. His research interests include FPGA computing, high-level synthesis, and machine learning. He is a member of IEEE, IEICE, and IPSJ.