Shinya Takamaeda
Hokkaido University, Japan
Model/Architecture Co-design for Accurate Binary Neural Network
Abstract
Energy-efficient deep neural networks (DNNs) are an important component of intelligent applications in embedded computing. The Binary Neural Network (BNN) is a hardware-friendly approach that improves energy efficiency and reduces hardware area, but it suffers from a critical drawback in accuracy. We present a model/architecture co-design approach for improving the accuracy of BNNs at low hardware cost. We introduce a "Dither" activation function for neural networks that leverages the high-precision intermediate information available inside hardware processing units. The Dither activation function distributes quantization errors to neighboring pixels, so that information loss between adjacent layers is reduced. In this talk, we present the basic idea of Dither NN and its evaluation results.
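To illustrate the error-diffusion idea behind the Dither activation, the sketch below binarizes a 2D activation map while spreading each pixel's quantization error to its not-yet-processed neighbors, in the style of Floyd-Steinberg dithering. This is a minimal NumPy illustration of the general concept; the function name, diffusion weights, and value range are assumptions for exposition, not the exact Dither NN formulation.

    import numpy as np

    def dither_binarize(x):
        """Binarize a 2D activation map (roughly in [-1, 1]) to {-1, +1},
        diffusing each pixel's quantization error to unprocessed neighbors.
        Hypothetical sketch; the actual Dither NN kernel may differ."""
        h, w = x.shape
        buf = x.astype(np.float32).copy()
        out = np.empty_like(buf)
        for i in range(h):
            for j in range(w):
                v = buf[i, j]
                q = 1.0 if v >= 0.0 else -1.0   # plain sign() binarization
                out[i, j] = q
                err = v - q                      # high-precision residual
                # Spread the residual to neighbors not yet quantized.
                if j + 1 < w:
                    buf[i, j + 1] += err * 7 / 16
                if i + 1 < h:
                    if j > 0:
                        buf[i + 1, j - 1] += err * 3 / 16
                    buf[i + 1, j] += err * 5 / 16
                    if j + 1 < w:
                        buf[i + 1, j + 1] += err * 1 / 16
        return out

    # Example: binarize a random activation map.
    act = np.random.uniform(-1.0, 1.0, size=(8, 8))
    binary_act = dither_binarize(act)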
Biography
Shinya Takamaeda-Yamazaki received the B.E., M.E., and D.E. degrees from Tokyo Institute of Technology, Japan, in 2009, 2011, and 2014, respectively. From 2011 to 2014, he was a JSPS research fellow (DC1). From 2014 to 2016, he was an assistant professor at Nara Institute of Science and Technology, Japan. Since 2016, he has been an associate professor at Hokkaido University, Japan. Since 2018, he has also been a researcher in JST PRESTO. His research interests include computer architecture, high-level synthesis, and machine learning. He is a member of IEEE, IEICE, and IPSJ.