Theoretical Analyses on 2-Norm-Based Multiple Kernel Regressors

Akira TANAKA, Hideyuki IMAI

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E100-A   No.3   pp.877-887
Publication Date: 2017/03/01
Online ISSN: 1745-1337
Type of Manuscript: PAPER
Category: Neural Networks and Bioengineering
Keywords: multiple kernel regressor, reproducing kernel Hilbert space, generalization error, 2-norm criterion, 2-norm regularizer



Summary: 
This paper discusses the solution of the standard 2-norm-based multiple kernel regression problem and the theoretical limit of the considered model space. We prove two results: 1) the solution of the 2-norm-based multiple kernel regressor constructed from a given training data set does not, in general, attain the theoretical limit of the considered model space in terms of generalization error, even when the training data set is noise-free; 2) the solution of the 2-norm-based multiple kernel regressor is identical to the solution of a single kernel regressor whose kernel is the sum of the kernels used in the multiple kernel regressor, both in the noise-free setting and in the noisy setting with a 2-norm-based regularizer. The first result motivates a novel framework for multiple kernel regression that yields a better solution, closer to the theoretical limit. The second result implies that, as long as the 2-norm-based criterion is used, it suffices to run a single kernel regressor with the sum of the given kernels instead of a multiple kernel regressor.
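The second result can be checked numerically. The sketch below (a minimal illustration on hypothetical toy data, not taken from the paper) solves the jointly regularized multiple kernel problem — minimizing ||y - K1*a1 - K2*a2||^2 + lam*(a1'K1*a1 + a2'K2*a2) — via its normal equations, and compares the fitted values with those of a single kernel ridge regressor whose Gram matrix is the sum K1 + K2. The kernel widths, sample size, and regularization weight are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical example)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

def gauss_gram(x, sigma):
    """Gram matrix of a Gaussian kernel on the sample points."""
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

K1 = gauss_gram(x, 0.1)   # narrow kernel
K2 = gauss_gram(x, 0.5)   # wide kernel
lam = 0.1                 # 2-norm regularization weight

# --- 2-norm-based multiple kernel regressor ---
# minimize ||y - K1 a1 - K2 a2||^2 + lam (a1' K1 a1 + a2' K2 a2)
B = np.hstack([K1, K2])                      # stacked design, shape (n, 2n)
D = np.block([[K1, np.zeros_like(K1)],
              [np.zeros_like(K2), K2]])      # block-diagonal regularizer
A = B.T @ B + lam * D                        # normal-equation matrix (singular in general)
c = np.linalg.lstsq(A, B.T @ y, rcond=None)[0]
pred_multi = B @ c                           # fitted values of the multiple kernel regressor

# --- Single kernel regressor with the SUM kernel ---
K = K1 + K2
alpha = np.linalg.solve(K + lam * np.eye(x.size), y)
pred_single = K @ alpha

print(np.max(np.abs(pred_multi - pred_single)))  # essentially zero
```

The agreement follows from the stationarity conditions: setting a1 = a2 = alpha with (K1 + K2 + lam*I) alpha = y satisfies the normal equations of the joint problem, so the fitted function is (K1 + K2) alpha, exactly the sum-kernel ridge solution.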