Low-Complexity Training for Binary Convolutional Neural Networks Based on Clipping-Aware Weight Update

Changho RYU
Tae-Hwan KIM

IEICE TRANSACTIONS on Information and Systems, Vol.E104-D, No.6, pp.919-922
Publication Date: 2021/06/01
Publicized: 2021/03/17
Online ISSN: 1745-1361
DOI: 10.1587/transinf.2020EDL8143
Type of Manuscript: LETTER
Category: Biocybernetics, Neurocomputing
Keywords: binarized neural networks, low complexity, quantization-aware training, convolutional neural networks


This letter presents an efficient technique for reducing the computational complexity of training binary convolutional neural networks (BCNNs). Conventionally, BCNN training focuses on optimizing the sign of each weight element rather than its exact value; once an element has been updated to a magnitude large enough to be clipped, its sign is unlikely to flip again. The proposed technique therefore skips updating such clipped elements, eliminating the computations involved in their optimization. The resulting complexity reduction is as high as 25.52% when training a BCNN model for the CIFAR-10 classification task, while the accuracy is maintained without severe degradation.
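The idea described in the abstract can be illustrated with a minimal NumPy sketch of one latent-weight update step. This is an assumption-laden illustration, not the authors' implementation: the function name, learning rate, and clipping threshold are hypothetical, and the straight-through-style update is the standard BinaryConnect-like scheme on which such letters typically build.

```python
import numpy as np

def clipping_aware_update(latent_w, grad, lr=0.01, clip=1.0):
    """One latent-weight update step for a BCNN layer (illustrative sketch).

    latent_w : real-valued latent weights, kept within [-clip, clip]
    grad     : gradient with respect to the binarized weights

    Elements already saturated at +/-clip are treated as settled: their
    sign is unlikely to flip, so their update is skipped entirely, which
    saves the arithmetic for every skipped element (the core idea of the
    clipping-aware scheme described above).
    """
    active = np.abs(latent_w) < clip           # mask of non-clipped elements
    latent_w[active] -= lr * grad[active]      # update only active elements
    np.clip(latent_w, -clip, clip, out=latent_w)
    binary_w = np.sign(latent_w)               # forward pass uses the sign
    binary_w[binary_w == 0] = 1.0              # map the rare exact zero to +1
    return latent_w, binary_w
```

In this sketch, the fraction of elements excluded by the `active` mask is exactly the fraction of per-element update computations saved, which is how a complexity reduction on the order of the reported 25.52% could arise as more weights saturate during training.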