A Learning Algorithm of Boosting Kernel Discriminant Analysis for Pattern Recognition

Shinji KITA, Seiichi OZAWA, Satoshi MAEKAWA, Shigeo ABE

Publication
IEICE TRANSACTIONS on Information and Systems, Vol.E90-D, No.11, pp.1853-1863
Publication Date: 2007/11/01
Online ISSN: 1745-1361
DOI: 10.1093/ietisy/e90-d.11.1853
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Biocybernetics, Neurocomputing
Keywords: boosting, kernel methods, kernel discriminant analysis, pattern classification, neural networks, feature selection

Summary:
In this paper, we present a new method to enhance the classification performance of a multiple classifier system by combining a boosting technique called AdaBoost.M2 with Kernel Discriminant Analysis (KDA). To reduce the dependency between classifier outputs and to speed up learning, each classifier is trained in a different feature space, obtained by applying KDA to a small set of hard-to-classify training samples. The training of the system is conducted based on AdaBoost.M2, and the classifiers are implemented as Radial Basis Function (RBF) networks. To perform KDA at every boosting round within a realistic time, a new kernel selection method based on a class separability measure is proposed. Furthermore, a new criterion for training convergence is proposed so that good classification performance is attained with fewer boosting rounds. To evaluate the proposed method, several experiments are carried out on standard benchmark datasets. The results demonstrate that the proposed method selects an optimal kernel parameter more efficiently than conventional cross-validation, and that the training of the boosted classifiers terminates after a fairly small number of rounds while attaining good classification accuracy. For multi-class problems, the proposed method outperforms both Boosting Linear Discriminant Analysis (BLDA) and the Radial Basis Function Network (RBFN) in classification accuracy. For 2-class problems, on the other hand, the advantage of the proposed Boosting Kernel Discriminant Analysis (BKDA) over BLDA and RBFN depends on the dataset.
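To make the training flow concrete, the following is a minimal sketch of the idea described above, not the authors' implementation: per-round kernel discriminant analysis on the currently hardest samples, kernel-width selection by a class separability measure, and a boosted ensemble. Several pieces are hedged assumptions: SAMME-style multiclass AdaBoost stands in for AdaBoost.M2's pseudo-loss formulation, a nearest-class-mean rule in the KDA subspace stands in for the RBF network classifiers, and J = tr(S_B)/tr(S_W) is one common separability measure; the abstract does not specify the paper's exact criterion.

import numpy as np
from scipy.linalg import eigh

def rbf(X, Y, gamma):
    # Gaussian kernel matrix, k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kda_fit(X, y, gamma, reg=1e-3):
    # Kernel Fisher discriminant on the subset (X, y); returns the
    # expansion coefficients of the C-1 discriminant axes.
    n, classes = len(X), np.unique(y)
    K = rbf(X, X, gamma)
    m_all = K.mean(axis=1, keepdims=True)
    M = np.zeros((n, n))                    # kernelized between-class scatter
    N = reg * np.eye(n)                     # kernelized within-class scatter + ridge
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        mc = Kc.mean(axis=1, keepdims=True)
        M += nc * (mc - m_all) @ (mc - m_all).T
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    w, A = eigh(M, N)                       # generalized eigenproblem M a = lam N a
    return A[:, -(len(classes) - 1):]       # top C-1 axes (eigh sorts ascending)

def separability(Z, y):
    # Class separability J = tr(S_B) / tr(S_W) in the projected space Z.
    mu, sb, sw = Z.mean(0), 0.0, 0.0
    for c in np.unique(y):
        Zc = Z[y == c]
        sb += len(Zc) * ((Zc.mean(0) - mu) ** 2).sum()
        sw += ((Zc - Zc.mean(0)) ** 2).sum()
    return sb / max(sw, 1e-12)

def train_bkda(X, y, rounds=10, subset=60, gammas=(0.01, 0.1, 1.0, 10.0)):
    classes = np.unique(y)
    n, C = len(X), len(classes)
    D = np.full(n, 1.0 / n)                 # boosting distribution over samples
    ensemble = []
    for _ in range(rounds):
        # Hardest (highest-weight) samples, chosen per class so every class is
        # present; assumes each class has at least subset // C training samples.
        hard = np.concatenate(
            [np.argsort(D * (y == c))[-(subset // C):] for c in classes])
        Xs, ys = X[hard], y[hard]
        # Kernel selection: pick the width that maximizes class separability.
        gamma = max(gammas, key=lambda g: separability(
            rbf(Xs, Xs, g) @ kda_fit(Xs, ys, g), ys))
        A = kda_fit(Xs, ys, gamma)
        means = np.stack([(rbf(Xs, Xs, gamma) @ A)[ys == c].mean(0)
                          for c in classes])
        def h(Xq, Xs=Xs, A=A, g=gamma, means=means):
            Zq = rbf(Xq, Xs, g) @ A         # project queries into the KDA subspace
            d = ((Zq[:, None, :] - means[None]) ** 2).sum(-1)
            return classes[d.argmin(1)]
        pred = h(X)
        err = D[pred != y].sum()
        if err >= 1.0 - 1.0 / C:            # no better than chance; skip this round
            continue
        alpha = np.log((1.0 - err) / max(err, 1e-12)) + np.log(C - 1)
        ensemble.append((alpha, h))
        D *= np.exp(alpha * (pred != y))    # up-weight misclassified samples
        D /= D.sum()
    def vote(Xq):
        scores = np.zeros((len(Xq), C))
        for a, hk in ensemble:
            scores[np.arange(len(Xq)), np.searchsorted(classes, hk(Xq))] += a
        return classes[scores.argmax(1)]
    return vote

Usage would be vote = train_bkda(X_train, y_train) followed by y_hat = vote(X_test). Note that the per-round KDA is fit only on the small hard subset, which is what keeps the round-by-round eigendecomposition tractable, in line with the speed-up motivation stated in the summary.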