Introduction of Orthonormal Transform into Neural Filter for Accelerating Convergence Speed

Isao NAKANISHI  Yoshio ITOH  Yutaka FUKUI  

IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E83-A   No.2   pp.367-370
Publication Date: 2000/02/25
Print ISSN: 0916-8508
Type of Manuscript: Special Section LETTER (Special Section on Intelligent Signal and Image Processing)
Keywords: transform domain neural filter, neural filter, multi-layer neural networks, back-propagation algorithm, normalized step size


As a nonlinear adaptive filter, the neural filter is used to process nonlinear signals and/or systems. However, the neural filter requires a large number of iterations to converge. This letter presents a new structure for the multi-layer neural filter in which an orthonormal transform is introduced into all inter-layers to accelerate convergence. The proposed structure is called the transform domain neural filter (TDNF) for convenience. The weights are basically updated by the Back-Propagation (BP) algorithm, but the algorithm must be modified because the error back-propagates through the orthonormal transform. Moreover, a variable step size, normalized by the transformed signal power, is introduced into the BP algorithm to fully exploit the orthonormal transform. Computer simulations confirm that introducing the orthonormal transform effectively speeds up convergence of the neural filter.
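The idea in the abstract can be sketched in Python/NumPy as follows. This is a minimal illustration, not the letter's exact formulation: it assumes a DCT-II as the orthonormal transform, a single tanh hidden layer with a linear output neuron, and an exponentially weighted running estimate of the transformed signal power for the normalized step size. For brevity the transform is applied only at the input layer, whereas the letter introduces it into all inter-layers; all names and constants are illustrative.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II matrix T, so that T @ T.T == I (an orthonormal transform).
    T = np.zeros((n, n))
    for k in range(n):
        for i in range(n):
            T[k, i] = np.cos(np.pi * k * (2 * i + 1) / (2 * n))
    T[0] *= np.sqrt(1.0 / n)
    T[1:] *= np.sqrt(2.0 / n)
    return T

class TransformDomainNeuralFilter:
    """Sketch of a transform-domain neural filter (hypothetical class name)."""

    def __init__(self, n_in, n_hidden, mu=0.05, beta=0.9, eps=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        self.T = dct_matrix(n_in)                    # fixed orthonormal transform
        self.W1 = rng.normal(scale=0.3, size=(n_hidden, n_in))
        self.W2 = rng.normal(scale=0.3, size=(1, n_hidden))
        self.mu, self.beta, self.eps = mu, beta, eps
        self.power = np.ones(n_in)                   # running power of transformed input

    def forward(self, x):
        self.u = self.T @ x                          # transform the input vector
        self.h = np.tanh(self.W1 @ self.u)           # hidden layer
        self.y = float(self.W2 @ self.h)             # linear output neuron
        return self.y

    def update(self, d):
        # BP update with a step size normalized by the transformed signal power.
        e = d - self.y
        self.power = self.beta * self.power + (1 - self.beta) * self.u ** 2
        grad2 = e * self.h                           # output-layer gradient
        delta1 = (self.W2.ravel() * e) * (1 - self.h ** 2)
        # Per-component normalization of the transformed input (variable step size).
        grad1 = np.outer(delta1, self.u / (self.power + self.eps))
        self.W2 += self.mu * grad2[np.newaxis, :]
        self.W1 += self.mu * grad1
        return e
```

A typical use would be nonlinear system identification: feed tapped-delay-line input vectors through `forward`, then call `update` with the desired response. The per-component power normalization plays the same role as in transform-domain adaptive LMS filtering, equalizing the convergence modes of the decorrelated input components.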