Analog VLSI Implementation of Adaptive Algorithms by an Extended Hebbian Synapse Circuit

Takashi MORIE, Osamu FUJITA, Yoshihito AMEMIYA

Publication
IEICE TRANSACTIONS on Electronics   Vol.E75-C   No.3   pp.303-311
Publication Date: 1992/03/25
Print ISSN: 0916-8516
Type of Manuscript: Special Section PAPER (Special Issue on Analog LSI and Related Technology)
Keyword: neural network, backpropagation, Boltzmann machine, analog LSI, synapse, Hebb learning

Summary: 
First, a number of issues pertaining to analog VLSI implementation of Backpropagation (BP) and Deterministic Boltzmann Machine (DBM) learning algorithms are clarified. According to software simulation results, a mismatch between the activation function and its derivative, when the two are generated by independent circuits, degrades BP learning performance. The performance can be improved, however, by adjusting the gain of the activation function used to obtain the derivative, irrespective of the original activation function. Calculation errors embedded in the circuits also degrade learning performance: BP learning is sensitive to offset errors in the multiplications performed during learning, and DBM learning is sensitive to asymmetry between the weight increment and decrement processes. Next, an analog VLSI architecture that implements both algorithms using common building-block circuits is proposed. Evaluation of test chips confirms that synaptic weights can be updated at up to 1 MHz and that a weight resolution exceeding 14 bits can be attained. The test chips successfully perform XOR learning with each algorithm.
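The effect of a gain-adjusted derivative can be illustrated in software. The following is a minimal NumPy sketch (not taken from the paper) of BP learning on XOR with a logistic activation, where the gain used to generate the derivative (deriv_gain) is set independently of the forward-pass gain (forward_gain), loosely emulating an activation and a derivative produced by separate circuits; the network size, gains, learning rate, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def act(x, gain=1.0):
    """Logistic activation with adjustable gain (slope)."""
    return 1.0 / (1.0 + np.exp(-gain * x))

def act_deriv(x, gain=1.0):
    """Derivative computed from a possibly mismatched activation gain."""
    y = act(x, gain)
    return gain * y * (1.0 - y)

# XOR training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# 2-3-1 network with small random initial weights (illustrative sizes)
W1 = rng.normal(0.0, 1.0, (2, 3)); b1 = np.zeros(3)
W2 = rng.normal(0.0, 1.0, (3, 1)); b2 = np.zeros(1)

eta = 0.5            # learning rate (illustrative)
forward_gain = 1.0   # gain of the forward-pass activation
deriv_gain = 1.0     # gain used only for the derivative;
                     # set != forward_gain to emulate the mismatch / adjustment

for epoch in range(20000):
    # forward pass
    u1 = X @ W1 + b1
    h = act(u1, forward_gain)
    u2 = h @ W2 + b2
    y = act(u2, forward_gain)

    # backward pass using the independently generated derivative
    delta2 = (T - y) * act_deriv(u2, deriv_gain)
    delta1 = (delta2 @ W2.T) * act_deriv(u1, deriv_gain)

    # weight updates
    W2 += eta * h.T @ delta2; b2 += eta * delta2.sum(axis=0)
    W1 += eta * X.T @ delta1; b1 += eta * delta1.sum(axis=0)

print(np.round(y, 3))  # outputs should approach [0, 1, 1, 0] for most seeds
```

Raising deriv_gain relative to forward_gain mimics the gain adjustment described in the summary; an additive offset injected into the weight-update products could be modeled in the same loop to probe the offset-error sensitivity mentioned above.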