Analytic Optimization of Shrinkage Parameters Based on Regularized Subspace Information Criterion

Masashi SUGIYAMA, Keisuke SAKURAI

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences, Vol. E89-A, No. 8, pp. 2216-2225
Publication Date: 2006/08/01
Online ISSN: 1745-1337
Print ISSN: 0916-8508
DOI: 10.1093/ietfec/e89-a.8.2216
Type of Manuscript: PAPER
Category: Neural Networks and Bioengineering
Keyword:
supervised learning, generalization capability, model selection, shrinkage estimator, regularized subspace information criterion

Summary: 
To obtain good generalization capability in supervised learning, model parameters should be optimized, i.e., determined so that the generalization error is minimized. However, since the generalization error is inaccessible in practice, model parameters are usually determined so that an estimate of the generalization error is minimized instead. A standard procedure for model parameter optimization is to prepare a finite set of candidate model parameter values, estimate the generalization error for each candidate, and choose the best candidate. Increasing the number of candidates can improve the optimization quality, but it also increases the computational cost. In this paper, we give methods for analytically finding the optimal model parameter value from among infinitely many candidates. This maximizes the optimization quality while keeping the computational cost reasonable.
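
The paper's closed-form minimizers of the regularized subspace information criterion (R-SIC) are derived in the full text. As a rough illustration of the contrast drawn above, the following sketch compares candidate-based selection with analytic optimization of a shrinkage parameter, using Stein's unbiased risk estimate (SURE) for the simple shrinkage estimator theta_hat = (1 - lambda) * y. This is an analogous but different criterion from R-SIC; the toy setup and all names are our assumptions, not the paper's.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed for illustration): observe y = theta + Gaussian noise
# and estimate theta with the shrinkage estimator (1 - lam) * y, 0 <= lam <= 1.
n, sigma = 100, 1.0
theta = rng.normal(0.0, 2.0, size=n)      # unknown true parameter
y = theta + rng.normal(0.0, sigma, size=n)

def sure(lam):
    # Unbiased estimate of the risk of the linear estimator c * y (c = 1 - lam):
    # SURE(c) = -n*sigma^2 + ||c*y - y||^2 + 2*sigma^2 * n * c,
    # since the divergence of c * y with respect to y is n * c.
    c = 1.0 - lam
    return -n * sigma**2 + (c - 1.0)**2 * np.sum(y**2) + 2.0 * sigma**2 * n * c

# (a) Standard procedure: a finite candidate set, pick the empirical minimizer.
candidates = np.linspace(0.0, 1.0, 21)
lam_grid = candidates[np.argmin([sure(lam) for lam in candidates])]

# (b) Analytic optimization: SURE is quadratic in lam, so setting its
# derivative to zero gives the minimizer in closed form.
lam_analytic = np.clip(n * sigma**2 / np.sum(y**2), 0.0, 1.0)

print(f"grid search : lam = {lam_grid:.4f}, SURE = {sure(lam_grid):.2f}")
print(f"analytic    : lam = {lam_analytic:.4f}, SURE = {sure(lam_analytic):.2f}")

Because the risk estimate here is quadratic in the shrinkage parameter, the exact minimizer is available in closed form, while the grid search can only get within its own resolution of it; the paper obtains the analogous closed-form minimizer for R-SIC.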