Introduction of Orthonormal Transform into Neural Filter for Accelerating Convergence Speed
Isao NAKANISHI, Yoshio ITOH, Yutaka FUKUI
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences
Publication Date: 2000/02/25
Print ISSN: 0916-8508
Type of Manuscript: Special Section LETTER (Special Section on Intelligent Signal and Image Processing)
Keywords: transform domain neural filter, neural filter, multi-layer neural networks, back-propagation algorithm, normalized step size
As a nonlinear adaptive filter, the neural filter is used to process nonlinear signals and/or systems. However, the neural filter requires a large number of iterations to converge. This letter presents a new structure for the multi-layer neural filter in which an orthonormal transform is introduced into all inter-layers to accelerate convergence. The proposed structure is called the transform domain neural filter (TDNF) for convenience. The weights are basically updated by the back-propagation (BP) algorithm, but the algorithm must be modified because the error back-propagates through the orthonormal transform. Moreover, a variable step size, normalized by the transformed signal power, is introduced into the BP algorithm to realize the orthonormal transform. Computer simulations confirm that introducing the orthonormal transform is effective in speeding up the convergence of the neural filter.
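To make the idea concrete, the following is a minimal sketch of a transform-domain neural filter in NumPy. It is not the authors' implementation: the orthonormal DCT-II is chosen here as one example of an orthonormal transform, the network is a single hidden layer (the letter describes transforms in all inter-layers of a multi-layer filter), and the class name `TDNF`, the forgetting factor `beta`, and the step size `mu` are illustrative assumptions. The key elements from the abstract are shown: the input is mapped through an orthonormal transform, the BP update back-propagates the error through that transform-domain input, and each transform-domain coefficient's step size is normalized by a running estimate of its signal power.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II matrix: one possible choice of orthonormal transform.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] /= np.sqrt(2)
    return C * np.sqrt(2.0 / n)

class TDNF:
    """Illustrative transform-domain neural filter with one hidden layer."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.T = dct_matrix(n_in)                 # orthonormal transform at the input
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.W2 = rng.normal(0.0, 0.1, (1, n_hidden))
        self.power = np.ones(n_in)                # running transformed-signal power estimate

    def forward(self, x):
        self.u = self.T @ x                       # transform-domain input
        self.h = np.tanh(self.W1 @ self.u)        # nonlinear hidden layer
        return float(self.W2 @ self.h)            # linear output

    def update(self, x, d, mu=0.1, beta=0.9, eps=1e-8):
        # One BP step with the step size normalized by transformed signal power.
        y = self.forward(x)
        e = d - y
        self.power = beta * self.power + (1 - beta) * self.u ** 2
        delta1 = (self.W2.T * e).ravel() * (1.0 - self.h ** 2)
        self.W2 += mu * e * self.h[None, :]
        # Normalization: gradients of low-power transform coefficients are boosted,
        # which equalizes convergence across the transform domain.
        self.W1 += mu * np.outer(delta1, self.u / (self.power + eps))
        return e
```

A short usage sketch: train the filter on an assumed nonlinear target (here `d = tanh(0.5 * sum(x))`, purely for illustration) and observe that the squared error decreases over iterations.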