Boosting Learning Algorithm for Pattern Recognition and Beyond

Osamu KOMORI, Shinto EGUCHI

Publication
IEICE TRANSACTIONS on Information and Systems, Vol.E94-D, No.10, pp.1863-1869
Publication Date: 2011/10/01
Online ISSN: 1745-1361
DOI: 10.1587/transinf.E94.D.1863
Print ISSN: 0916-8532
Type of Manuscript: INVITED PAPER (Special Section on Information-Based Induction Sciences and Machine Learning)
Keyword: AUC; boosting; entropy; divergence; ROC; U-loss function; density estimation

Summary: 
This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties, such as Bayes risk consistency for several loss functions, are discussed in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets. A unified derivation is given by a generator function U, which naturally defines an entropy, a divergence and a loss function. The class of U-loss functions is associated with boosting learning algorithms for loss minimization, including AdaBoost and LogitBoost as a twin generated from the Kullback-Leibler divergence, as well as boosting for the (partial) area under the ROC curve. We extend boosting to unsupervised learning, typically density estimation, employing the U-loss function. Finally, a future perspective on machine learning is discussed.
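As a rough sketch of this unified construction (the notation below is our shorthand for the standard U-divergence framework and is not necessarily the paper's exact formulation): take a convex generator function U with derivative u, and let \xi = u^{-1}. For densities f and g one can define

    C_U(f,g) = \int \bigl\{ U(\xi(g(x))) - f(x)\,\xi(g(x)) \bigr\}\,dx   \quad \text{(U-cross entropy)}
    H_U(f)   = C_U(f,f)                                                  \quad \text{(U-entropy)}
    D_U(f,g) = C_U(f,g) - H_U(f) \;\ge\; 0                               \quad \text{(U-divergence)}

where the nonnegativity of D_U follows from the convexity of U. The choice U(t) = exp(t), for which \xi(s) = \log s, recovers the (extended) Kullback-Leibler divergence, the case from which AdaBoost and LogitBoost arise as a twin.

For concreteness, the following is a minimal textbook AdaBoost with decision stumps (our own illustrative sketch, not code accompanying the paper). The weight update w <- w * exp(-alpha * y * h(x)) is the sequential minimization step for the exponential loss, i.e. the U-loss with U(t) = exp(t).

    import numpy as np

    def best_stump(X, y, w):
        # Exhaustively pick the stump (feature, threshold, sign)
        # with the smallest weighted 0-1 error; y takes values in {-1, +1}.
        best, best_err = None, np.inf
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1.0, -1.0):
                    pred = s * np.where(X[:, j] > t, 1.0, -1.0)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best, best_err = (j, t, s), err
        return best, best_err

    def stump_predict(stump, X):
        j, t, s = stump
        return s * np.where(X[:, j] > t, 1.0, -1.0)

    def adaboost(X, y, T=50):
        # Returns a list of (alpha, stump) pairs defining the score F(x).
        n = len(y)
        w = np.full(n, 1.0 / n)
        ensemble = []
        for _ in range(T):
            stump, err = best_stump(X, y, w)
            err = np.clip(err, 1e-12, 1 - 1e-12)  # guard degenerate cases
            alpha = 0.5 * np.log((1.0 - err) / err)
            # Exponential-loss reweighting: misclassified points gain weight.
            w *= np.exp(-alpha * y * stump_predict(stump, X))
            w /= w.sum()
            ensemble.append((alpha, stump))
        return ensemble

    def predict(ensemble, X):
        F = sum(alpha * stump_predict(stump, X) for alpha, stump in ensemble)
        return np.sign(F)

As a quick sanity check on synthetic data, e.g. X = np.random.randn(200, 2) with y = np.sign(X[:, 0] + X[:, 1]), the ensemble should reach near-zero training error within a few dozen rounds.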