Fast Inference of Binarized Convolutional Neural Networks Exploiting Max Pooling with Modified Block Structure

Ji-Hoon SHIN, Tae-Hwan KIM

Publication
IEICE TRANSACTIONS on Information and Systems   Vol.E103-D   No.3   pp.706-710
Publication Date: 2020/03/01
Online ISSN: 1745-1361
DOI: 10.1587/transinf.2019EDL8165
Type of Manuscript: LETTER
Category: Software System
Keywords: binarized neural networks, embedded systems, convolutional neural networks, inference, deep learning

Summary: 
This letter presents a novel technique for fast inference of binarized convolutional neural networks (BCNNs). The proposed technique modifies the structure of the constituent blocks of the BCNN model so that the input elements of the max-pooling operation are binary. In this structure, as soon as any input element is +1, the pooling result is determined to be +1; the proposed technique skips the computations that would otherwise be required to obtain the remaining input elements, thereby reducing the inference time effectively. The proposed technique reduces the inference time by up to 34.11% while maintaining the classification accuracy.
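
The core idea can be illustrated with a minimal sketch (not the authors' implementation): because the pooling inputs are binary, max pooling over a window of {-1, +1} values returns +1 as soon as any element evaluates to +1, so the convolutions producing the remaining elements of that window can be skipped. The function names (binarized_conv_at, pooled_output), the per-channel threshold that is assumed to fold batch normalization into a comparison, and the 2x2 pooling window are illustrative assumptions.

    import numpy as np

    def binarized_conv_at(x_bin, w_bin, threshold, r, c):
        """Compute one binary conv output at spatial position (r, c).

        x_bin, w_bin : binarized input and weights in {-1, +1}
                       (shapes (C, H, W) and (C, k, k), assumed here)
        threshold    : scalar assumed to fold batch normalization,
                       so binarization reduces to a comparison
        Returns +1 or -1.
        """
        k = w_bin.shape[-1]                     # kernel size (assumed square)
        patch = x_bin[:, r:r + k, c:c + k]      # input patch for this position
        acc = np.sum(patch * w_bin)             # XNOR-popcount in binary hardware
        return 1 if acc >= threshold else -1

    def pooled_output(x_bin, w_bin, threshold, r0, c0, pool=2):
        """Max pooling over a pool x pool window of *binary* conv outputs.

        Since every element is in {-1, +1}, the pooled value is +1 as soon
        as any element is +1; the remaining conv computations in the window
        are skipped (early termination).
        """
        for dr in range(pool):
            for dc in range(pool):
                if binarized_conv_at(x_bin, w_bin, threshold,
                                     r0 + dr, c0 + dc) == 1:
                    return 1                    # short-circuit: skip the rest
        return -1                               # all elements were -1

The short-circuit is only valid because, in the modified block structure, binarization (here assumed to be absorbed into the threshold comparison) is placed before max pooling, which is what makes every pooling input binary.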