

Numerical Evaluation of Incremental Vector Quantization Using Stochastic Relaxation
Noritaka SHIGEI, Hiromi MIYAJIMA, Michiharu MAEDA
Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences
Vol. E87-A, No. 9, pp. 2364-2371
Publication Date: 2004/09/01
Print ISSN: 0916-8508
Type of Manuscript: Special Section PAPER (Special Section on Nonlinear Theory and its Applications)
Keywords: vector quantization, stochastic relaxation, incremental learning, K-means, neural-gas
Summary:
Learning algorithms for Vector Quantization (VQ) are categorized into two types: batch learning and incremental learning. Incremental learning is more useful than batch learning because, unlike batch learning, it can be performed either online or offline. In this paper, we develop effective incremental learning methods by using Stochastic Relaxation (SR) techniques, which were originally developed for batch learning. It has been shown that, for batch learning, SR techniques can provide good global optimization without greatly increasing the computational cost. We empirically investigate effective implementations of SR for incremental learning. Specifically, we consider five types of SR methods: ISR1, ISR2, ISR3, WSR1 and WSR2. The ISRs and WSRs add noise to the input vectors and the weight vectors, respectively; the methods differ in when the perturbed input or weight vectors are used in learning. These SR methods are applied to three types of incremental learning: K-means, Neural-Gas (NG) and Kohonen's Self-Organizing Map (SOM). We comprehensively evaluate these combinations in terms of accuracy and computation time. Our simulation results show that K-means with ISR3 is the most effective overall among these combinations and is superior to the conventional NG method, which is known to be an excellent method.
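To make the general idea concrete, the following is a minimal sketch of incremental (online) K-means with input-noise stochastic relaxation: at each step, Gaussian noise with a decaying amplitude is added to the input vector before the winner search and update. This illustrates the ISR family only in spirit; the noise schedule, parameter values, and the exact point at which the perturbed vector is used are assumptions for illustration, not the paper's definition of ISR3.

```python
import random


def incremental_kmeans_sr(data, k, epochs=10, lr0=0.5, sigma0=0.5, seed=0):
    """Online (incremental) K-means with input-noise stochastic relaxation.

    Each presented input vector is perturbed by zero-mean Gaussian noise
    whose standard deviation decays linearly to zero over training, so the
    process anneals toward plain incremental K-means. This is an
    illustrative ISR-style variant, not the paper's exact ISR3 method.
    """
    rng = random.Random(seed)
    data = [list(x) for x in data]  # local copy; training shuffles in place
    dim = len(data[0])
    # Initialize weight (codebook) vectors from random data points.
    weights = [list(rng.choice(data)) for _ in range(k)]
    t, total = 0, epochs * len(data)
    for _ in range(epochs):
        rng.shuffle(data)
        for x in data:
            t += 1
            frac = t / total
            lr = lr0 * (1.0 - frac)        # learning rate decays to zero
            sigma = sigma0 * (1.0 - frac)  # noise amplitude decays to zero
            # Perturb the input vector (stochastic relaxation on inputs).
            xp = [xi + rng.gauss(0.0, sigma) for xi in x]
            # Winner-take-all: nearest weight vector to the perturbed input.
            win = min(range(k),
                      key=lambda j: sum((xp[d] - weights[j][d]) ** 2
                                        for d in range(dim)))
            # Move only the winner toward the perturbed input.
            for d in range(dim):
                weights[win][d] += lr * (xp[d] - weights[win][d])
    return weights
```

The decaying noise plays the same role as temperature in annealing: early perturbations let weight vectors escape poor local configurations, while the decay lets the codebook settle as training ends. NG and SOM variants would differ only in updating several ranked or neighboring units per step instead of the single winner.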

