A Genetic Algorithm Creates New Attractors in an Associative Memory Network by Pruning Synapses Adaptively

Akira IMADA, Keijiro ARAKI

Publication
IEICE TRANSACTIONS on Information and Systems, Vol. E81-D, No. 11, pp. 1290-1297
Publication Date: 1998/11/25
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Bio-Cybernetics and Neurocomputing
Keyword: genetic algorithm, fully connected neural network model of associative memory, storage capacity, synaptic symmetry and dilution

Summary: 
We apply evolutionary algorithms to a neural network model of associative memory. In this model, certain configurations of the synaptic weights allow the network to store a number of patterns as an associative memory; the so-called Hebbian rule prescribes one such configuration. However, when the number of patterns to be stored exceeds a critical value (the over-loaded regime), the ability to retrieve the stored patterns collapses. Likewise, synaptic weights chosen at random have no retrieval ability at all. In this paper, we describe a genetic algorithm that successfully evolves both random synapses and over-loaded Hebbian synapses into a functioning associative memory by adaptively pruning some of the synaptic connections. Although many authors have shown that the model is robust against pruning a fraction of its synaptic connections, improving performance by pruning has, as far as we know, not been explored.
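
To make the idea in the summary concrete, the following is a minimal Python sketch (not the authors' implementation): patterns of +/-1 units are stored with the standard Hebbian rule, and a simple genetic algorithm then evolves a binary pruning mask over the synapses, with fitness given by the average recall overlap of the stored patterns. The network size, memory load, GA operators, and all parameter values below are illustrative assumptions rather than the settings used in the paper.

import numpy as np

rng = np.random.default_rng(0)

N = 64               # number of neurons (illustrative, not from the paper)
P = 12               # patterns to store; above ~0.14*N the Hebbian net is over-loaded
POP, GENS = 30, 50   # GA population size / generations (illustrative)
MUT = 0.01           # per-synapse bit-flip mutation probability (illustrative)

# Random +/-1 patterns and the standard Hebbian weight matrix.
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall_overlap(weights):
    """Average final overlap with each stored pattern after a few
    synchronous update sweeps, starting from the pattern itself."""
    total = 0.0
    for xi in patterns:
        s = xi.copy()
        for _ in range(10):
            s = np.where(weights @ s >= 0, 1, -1)
        total += np.abs(s @ xi) / N
    return total / P

def fitness(mask):
    # A pruning mask simply zeroes a subset of the synaptic weights.
    return recall_overlap(J * mask)

# Each individual is a binary pruning mask over all synapses.
population = rng.integers(0, 2, size=(POP, N, N))

for gen in range(GENS):
    scores = np.array([fitness(m) for m in population])
    order = np.argsort(scores)[::-1]
    elite = population[order[:POP // 2]]            # truncation selection
    children = []
    for _ in range(POP - len(elite)):
        a, b = elite[rng.integers(len(elite), size=2)]
        cut = rng.integers(N * N)                   # one-point crossover on the flattened mask
        child = np.concatenate([a.ravel()[:cut], b.ravel()[cut:]]).reshape(N, N)
        flips = rng.random((N, N)) < MUT            # bit-flip mutation
        child = np.where(flips, 1 - child, child)
        children.append(child)
    population = np.concatenate([elite, np.array(children)])

best = max(population, key=fitness)
print("Hebbian overlap:", round(recall_overlap(J), 3),
      "pruned overlap:", round(fitness(best), 3))

In the over-loaded regime (P well above ~0.14*N), the unpruned Hebbian overlap degrades, and the evolved mask is intended to recover retrieval quality by removing harmful synapses; the same loop applied to a random initial weight matrix illustrates the paper's other starting point.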