Speeding up Deep Neural Networks in Speech Recognition with Piecewise Quantized Sigmoidal Activation Function
Anhao XING Qingwei ZHAO Yonghong YAN
IEICE TRANSACTIONS on Information and Systems
Publication Date: 2016/10/01
Online ISSN: 1745-1361
Type of Manuscript: Special Section LETTER (Special Section on Recent Advances in Machine Learning for Spoken Language Processing)
Category: Acoustic modeling
Keywords: deep neural networks, speech recognition, activation function, fixed-point quantization
This paper proposes a new quantization framework for the activation function of deep neural networks (DNNs). We implement fixed-point DNNs by quantizing the activations into powers-of-two integers, so the costly multiplication operations in DNN inference can be replaced with low-cost bit-shifts, massively reducing computation. This makes DNN-based speech recognition much easier to deploy on embedded systems. Experiments show that the proposed method causes no performance degradation.
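The core idea of the abstract — quantizing activations to powers of two so that weight-activation multiplications become bit-shifts — can be sketched as follows. This is a hypothetical illustration of the general technique, not the paper's exact quantization scheme; the exponent range and rounding rule are assumptions for the example.

```python
import math

def quantize_pow2(x, min_exp=-8, max_exp=0):
    """Quantize a positive activation to the nearest power of two.

    Returns the exponent e such that 2**e approximates x, clamped to
    [min_exp, max_exp]. (Illustrative; the paper's scheme may differ.)
    """
    if x <= 0:
        return None  # treat non-positive activations as zero
    e = round(math.log2(x))
    return max(min_exp, min(max_exp, e))

def shift_multiply(weight_int, exp):
    """Multiply an integer weight by a power-of-two activation via bit-shift.

    weight_int * 2**exp is computed with a left or right shift instead of
    a multiplication, which is the cost saving the abstract describes.
    """
    if exp is None:
        return 0
    if exp >= 0:
        return weight_int << exp
    return weight_int >> -exp

# Example: a sigmoid output of 0.26 is quantized to 2**-2 = 0.25,
# so multiplying a fixed-point weight 40 becomes a right shift by 2.
e = quantize_pow2(0.26)    # -2
y = shift_multiply(40, e)  # 40 >> 2 == 10, i.e. 40 * 0.25
```

Since a sigmoid activation lies in (0, 1), its power-of-two exponents are non-positive, so every multiply in a layer's matrix-vector product reduces to an integer right shift.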