Constructing Kernel Functions for Binary Regression

Masashi SUGIYAMA  Hidemitsu OGAWA  

IEICE TRANSACTIONS on Information and Systems   Vol.E89-D    No.7    pp.2243-2249
Publication Date: 2006/07/01
Online ISSN: 1745-1361
DOI: 10.1093/ietisy/e89-d.7.2243
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Pattern Recognition
Keywords: supervised learning, regression, kernel methods, kernel functions, Karhunen-Loeve expansion, principal component analysis, binary regression, Gaussian kernel


Kernel-based learning algorithms have been successfully applied in various problem domains, provided that appropriate kernel functions are available. In this paper, we discuss the problem of designing kernel functions for binary regression and show that using a bell-shaped cosine function as a kernel function is optimal in some sense. This result rests on the Karhunen-Loeve expansion: the optimal approximation to a set of functions is given by the principal component of the correlation operator of those functions.
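The underlying principle can be illustrated numerically: among all one-dimensional subspaces, the span of the top eigenvector of the sample correlation operator gives the smallest mean squared approximation error over a set of functions. The sketch below uses a hypothetical family of sigmoidal target functions (not the function class analyzed in the paper) discretized on a grid, so the correlation operator becomes an ordinary matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# A small illustrative set of smooth step-like functions, standing in for
# binary regression targets (an assumption for this sketch only).
funcs = np.array([np.tanh(10.0 * (x - c)) for c in np.linspace(0.2, 0.8, 25)])

# Discretized sample correlation operator of the function set.
R = funcs.T @ funcs / len(funcs)

# Principal component: eigenvector of R with the largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(R)
pc = eigvecs[:, -1]

def mean_sq_error(basis):
    """Mean squared error when every function in the set is replaced by
    its orthogonal projection onto the 1-D subspace spanned by `basis`."""
    basis = basis / np.linalg.norm(basis)
    proj = np.outer(funcs @ basis, basis)
    return np.mean(np.sum((funcs - proj) ** 2, axis=1))

# The principal component attains the smallest error among 1-D subspaces;
# any other direction (here, a random one) can only do worse or equal.
random_dir = rng.standard_normal(len(x))
print(mean_sq_error(pc) <= mean_sq_error(random_dir))
```

The same Karhunen-Loeve argument carries over to the continuous setting, where the correlation operator acts on a function space and its leading eigenfunction plays the role of the top eigenvector here.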
