Slides available here!


Speaker:

Sungjoo Yoo, Seoul National University, Korea

Title:

Fast and Low Power Deep Convolutional Neural Networks for Mobile Applications

Abstract:

Although the latest high-end smartphones have powerful CPUs and GPUs, running deep convolutional neural networks (CNNs) for complex tasks such as ImageNet classification on mobile devices remains challenging. To deploy deep CNNs on mobile devices, we present a simple and effective scheme that compresses the entire CNN, which we call one-shot whole-network compression. The proposed scheme consists of three steps: (1) rank selection with variational Bayesian matrix factorization, (2) Tucker decomposition on the kernel tensor, and (3) fine-tuning to recover the accumulated loss of accuracy; each step can be easily implemented using publicly available tools. We demonstrate the effectiveness of the proposed scheme by testing the performance of various compressed CNNs (AlexNet, VGG-S, GoogLeNet, and VGG-16) on a smartphone. Significant reductions in model size, runtime, and energy consumption are obtained at the cost of a small loss in accuracy.
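
To give a feel for step (2), below is a minimal NumPy sketch of a Tucker-2 decomposition of a convolutional kernel along its output-channel and input-channel modes, assuming the ranks have already been chosen (e.g., by VBMF as in step (1)). The function names, the single-pass HOSVD-style construction, and the layer shape in the example are illustrative assumptions, not taken from the talk or its implementation.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the remaining axes."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_dot(tensor, matrix, mode):
    """Multiply `tensor` along `mode` by `matrix` of shape (new_dim, old_dim)."""
    moved = np.moveaxis(tensor, mode, 0)
    out_shape = (matrix.shape[0],) + moved.shape[1:]
    result = matrix @ moved.reshape(moved.shape[0], -1)
    return np.moveaxis(result.reshape(out_shape), 0, mode)

def tucker2_conv_kernel(kernel, rank_out, rank_in):
    """Tucker-2 decomposition of a conv kernel of shape (T, S, d, d)
    along the output-channel (mode 0) and input-channel (mode 1) modes.
    Returns core (rank_out, rank_in, d, d), U_out (T, rank_out), U_in (S, rank_in)."""
    # Leading left singular vectors of the mode-0 and mode-1 unfoldings
    U_out, _, _ = np.linalg.svd(unfold(kernel, 0), full_matrices=False)
    U_in, _, _ = np.linalg.svd(unfold(kernel, 1), full_matrices=False)
    U_out, U_in = U_out[:, :rank_out], U_in[:, :rank_in]
    # Core tensor: project the kernel onto the two factor subspaces
    core = mode_dot(mode_dot(kernel, U_out.T, 0), U_in.T, 1)
    return core, U_out, U_in

# Hypothetical example: a 256->384, 3x3 conv kernel compressed to ranks (96, 64)
kernel = np.random.randn(384, 256, 3, 3).astype(np.float32)
core, U_out, U_in = tucker2_conv_kernel(kernel, rank_out=96, rank_in=64)
approx = mode_dot(mode_dot(core, U_out, 0), U_in, 1)
print(core.shape, U_out.shape, U_in.shape)                        # (96, 64, 3, 3) (384, 96) (256, 64)
print(np.linalg.norm(kernel - approx) / np.linalg.norm(kernel))   # relative approximation error
```

The point of such a factorization is that the original d-by-d convolution can be replaced by three smaller ones (a 1x1 convolution built from U_in, a d-by-d convolution with the core tensor, and a 1x1 convolution built from U_out), which is what reduces model size and runtime; fine-tuning, step (3), then recovers most of the lost accuracy.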

Bio:

Sungjoo Yoo received his Ph.D. from Seoul National University in 2000. He worked as a researcher at TIMA Laboratory, Grenoble, France, from 2000 to 2004, and was a principal engineer at Samsung System LSI from 2004 to 2008. He was at POSTECH from 2008 to 2015, joined Seoul National University in 2015, and is now an associate professor. His current research interests include low-power deep neural networks and near-data processing (processing-in-memory and in-storage processing).


