A Learning Algorithm with Activation Function Manipulation for Fault Tolerant Neural Networks
Naotake KAMIURA Yasuyuki TANIGUCHI Yutaka HATA Nobuyuki MATSUI
IEICE TRANSACTIONS on Information and Systems
Publication Date: 2001/07/01
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Fault Tolerance
Keywords: feedforward neural network, backpropagation algorithm, stuck-at fault, sigmoid activation function,
In this paper we propose a learning algorithm that enhances the fault tolerance of feedforward neural networks (NNs for short) by manipulating the gradient of the neurons' sigmoid activation functions. We assume stuck-at-0 and stuck-at-1 faults of the connection links. For the output layer, we employ a function with a relatively gentle gradient to enhance its fault tolerance. To enhance the fault tolerance of the hidden layer, we steepen the gradient of its function after convergence. The experimental results for a character recognition problem show that our NN is superior in fault tolerance, learning cycles and learning time to other NNs trained with algorithms employing fault injection, forcible weight limits, or the calculation of each weight's relevance to the output error. Moreover, the gradient manipulation incorporated in our algorithm does not degrade the generalization ability.
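As a rough illustration of the mechanism the abstract describes, the following Python sketch varies the sigmoid slope per layer and injects a stuck-at-0 fault on one connection weight. All names, slope values, and weights here are our own illustrative assumptions, not taken from the paper:

```python
import numpy as np

def sigmoid(x, slope=1.0):
    # Sigmoid with an adjustable gradient: a larger `slope` steepens the curve,
    # a smaller one makes it gentler (the manipulation the abstract refers to).
    return 1.0 / (1.0 + np.exp(-slope * x))

# Toy single-hidden-layer forward pass with illustrative random weights.
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(4, 3))   # input -> hidden connection weights
W_output = rng.normal(size=(3, 2))   # hidden -> output connection weights
x = rng.normal(size=4)

def forward(W_h, W_o, x, hidden_slope, output_slope):
    # Hypothetical per-layer slopes: gentle at the output, steep at the hidden layer.
    h = sigmoid(x @ W_h, slope=hidden_slope)
    return sigmoid(h @ W_o, slope=output_slope)

# Fault-free output.
y_ok = forward(W_hidden, W_output, x, hidden_slope=4.0, output_slope=0.5)

# Inject a stuck-at-0 fault on one hidden-to-output connection link
# by forcing that weight to zero.
W_faulty = W_output.copy()
W_faulty[0, 0] = 0.0
y_faulty = forward(W_hidden, W_faulty, x, hidden_slope=4.0, output_slope=0.5)

print("max output deviation under stuck-at-0:", np.abs(y_ok - y_faulty).max())
```

Comparing `y_ok` with `y_faulty` for different slope settings gives an intuitive check of the claim that gentler output-layer gradients damp the effect of a faulty link.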