Backpropagation Algorithm for LOGic Oriented Neural Networks with Quantized Weights and Multilevel Threshold Neurons

Takeshi KAMIO, Hisato FUJISAKA, Mititada MORISUE

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences, Vol. E84-A, No. 3, pp. 705-712
Publication Date: 2001/03/01
Print ISSN: 0916-8508
Type of Manuscript: Special Section PAPER (Special Section of Selected Papers from the 13th Workshop on Circuits and Systems in Karuizawa)
Keywords: LOGO neural networks, backpropagation, quantized weights, multilevel threshold neurons, decision level

Summary: 
The multilayer feedforward neural network (MFNN) trained by the backpropagation (BP) algorithm is one of the most significant models in artificial neural networks. MFNNs have been used in many areas of signal and image processing because of their broad applicability. Although they have been implemented as analog, mixed analog-digital, and fully digital VLSI circuits, it remains difficult to implement the BP learning function efficiently in hardware. This paper describes a special BP algorithm for the logic oriented neural network (LOGO-NN), which we have proposed as a type of MFNN with quantized weights and multilevel threshold neurons. In LOGO-NNs, both the weights and the neuron outputs are quantized to integer values. Furthermore, the proposed BP algorithm reduces the need for high-precision calculations. It is therefore expected that LOGO-NNs with BP learning can be implemented more efficiently as digital circuits than common MFNNs with the classical BP algorithm. Finally, simulations show that the proposed BP algorithm for LOGO-NNs performs well in terms of convergence rate, convergence speed, and generalization capability.
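
The abstract does not give the update rules, but the core idea (integer-quantized weights and multilevel threshold neuron outputs, trained with a BP-style error signal) can be sketched. Below is a minimal, hypothetical NumPy illustration that keeps full-precision "shadow" weights and passes gradients straight through the thresholds; the activation, weight range, and training scheme are all assumptions for illustration, not the authors' algorithm.

    import numpy as np

    def multilevel_threshold(x, levels=8):
        # Multilevel threshold neuron (assumed form): round the
        # pre-activation to the nearest integer level and clip to
        # the representable output range.
        return np.clip(np.round(x), 0, levels - 1)

    def quantize_weights(w, w_max=7):
        # Restrict weights to integers in [-w_max, w_max] (assumed range).
        return np.clip(np.round(w), -w_max, w_max)

    rng = np.random.default_rng(0)

    # Full-precision shadow weights accumulate the BP updates; the
    # forward pass always uses their quantized copies.
    W1f = rng.normal(0.0, 2.0, size=(2, 4))
    W2f = rng.normal(0.0, 2.0, size=(4, 1))

    # XOR-like toy problem with integer targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    lr = 0.05
    for epoch in range(500):
        W1, W2 = quantize_weights(W1f), quantize_weights(W2f)
        h = multilevel_threshold(X @ W1)   # quantized hidden outputs
        y = multilevel_threshold(h @ W2)   # quantized network output
        err = y - T                        # output error
        # Straight-through assumption: treat each threshold as the
        # identity when propagating the error backward.
        dW2 = h.T @ err
        dW1 = X.T @ (err @ W2.T)
        W1f -= lr * dW1
        W2f -= lr * dW2

Because the forward pass sees only integer weights and integer neuron outputs, the multiply-accumulate operations can be realized with low-precision digital arithmetic, which is the implementation advantage the paper targets; only the shadow-weight updates need any real-valued precision in this sketch.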