Principal Component Analysis by Homogeneous Neural Networks, Part I: The Weighted Subspace Criterion

Erkki OJA, Hidemitsu OGAWA, Jaroonsakdi WANGVIWATTANA

Publication
IEICE TRANSACTIONS on Information and Systems, Vol. E75-D, No. 3, pp. 366-375
Publication Date: 1992/05/25
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Bio-Cybernetics
Keywords: feature extraction, data compression, neural networks, learning algorithms

Summary: 
Principal Component Analysis (PCA) is a useful technique in feature extraction and data compression. It can be formulated as a statistical constrained maximization problem whose solution is given by unit eigenvectors of the data covariance matrix. In a practical application such as image compression, the problem can be solved numerically by a corresponding gradient ascent maximization algorithm. Such on-line algorithms are good alternatives to direct eigendecomposition because of their parallelism and their adaptivity to the input data. The algorithms can be implemented in a local and homogeneous way in learning neural networks. One example is the Subspace Network: a regular layer of parallel artificial neurons with a learning rule that is completely homogeneous with respect to the neurons. However, due to this complete homogeneity, the learning rule does not converge to the unique basis given by the dominant eigenvectors; any basis of the dominant eigenvector subspace is possible. In many applications, such as data compression, the subspace alone is not sufficient and the actual eigenvectors, or PCA coefficient vectors, are needed. A new criterion, called the Weighted Subspace Criterion, is proposed; it makes a small symmetry-breaking change to the Subspace Criterion so that only the true eigenvectors are solutions. Making the corresponding change to the learning rule of the Subspace Network gives a modified learning rule that can still be implemented on a homogeneous network architecture. During learning, the weight vectors tend to the true eigenvectors.
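The abstract does not spell out the learning rules, but the idea can be sketched. In the symmetric Subspace Network, every neuron uses the same update, so the weight matrix converges only to some orthonormal basis of the dominant eigenvector subspace; attaching a distinct positive weight theta_j to each neuron's feedback term breaks this symmetry, so that each weight vector settles on an individual eigenvector. The NumPy sketch below illustrates this behavior; the function name, the exact update form dw_j = lr * y_j * (x - theta_j * sum_i y_i w_i), and all hyperparameters are assumptions for illustration, not the paper's published algorithm.

```python
import numpy as np

def weighted_subspace_pca(X, p, thetas, lr=0.005, epochs=20, seed=0):
    """Sketch of a homogeneous one-layer PCA network trained with a
    weighted subspace rule. The exact update form is an assumption;
    the abstract only states that a small symmetry-breaking change to
    the Subspace Criterion makes the true eigenvectors the solutions.

    X      -- (n_samples, d) zero-mean data matrix
    p      -- number of neurons / dominant eigenvectors sought
    thetas -- p distinct positive weights; setting them all equal
              recovers the fully symmetric subspace rule, which only
              finds *some* basis of the dominant subspace
    """
    rng = np.random.default_rng(seed)
    # Columns of W are the neuron weight vectors; start near orthonormal.
    W = np.linalg.qr(rng.normal(size=(X.shape[1], p)))[0]
    for _ in range(epochs):
        for x in X:
            y = W.T @ x  # neuron outputs y_j = w_j^T x
            # Assumed weighted rule: dw_j = lr * y_j * (x - theta_j * sum_i y_i w_i)
            W += lr * (np.outer(x, y) - np.outer(W @ y, thetas * y))
    return W

# Toy check: data with well-separated variances along the coordinate axes.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 4)) * np.sqrt([5.0, 2.0, 0.5, 0.1])
X -= X.mean(axis=0)
W = weighted_subspace_pca(X, p=2, thetas=np.array([1.0, 0.5]))
W /= np.linalg.norm(W, axis=0)  # compare directions only
top = np.linalg.eigh(np.cov(X.T))[1][:, ::-1][:, :2]  # two dominant eigenvectors
print(np.abs(top.T @ W))  # roughly one entry near 1 per row and column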