On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion
Masashi SUGIYAMA, Makoto YAMADA
Publication: IEICE TRANSACTIONS on Information and Systems
Vol. E95-D, No. 10, pp. 2564-2567
Publication Date: 2012/10/01
Online ISSN: 1745-1361
Print ISSN: 0916-8532
DOI: 10.1587/transinf.E95.D.2564
Type of Manuscript: LETTER
Category: Artificial Intelligence, Data Mining
Keywords: Hilbert-Schmidt independence criterion, least-squares mutual information, cross-validation, Gaussian kernel
Summary:
The Hilbert-Schmidt independence criterion (HSIC) is a kernel-based statistical independence measure that can be computed very efficiently. However, it requires us to determine the kernel parameters heuristically because no objective model selection method is available. Least-squares mutual information (LSMI) is another statistical independence measure that is based on direct density-ratio estimation. Although LSMI is computationally more expensive than HSIC, LSMI is equipped with cross-validation, and thus the kernel parameter can be determined objectively. In this paper, we show that HSIC can actually be regarded as an approximation to LSMI, which allows us to utilize cross-validation of LSMI for determining kernel parameters in HSIC. Consequently, both computational efficiency and cross-validation can be achieved.
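To make the quantity under discussion concrete, the following is a minimal sketch of the standard empirical HSIC estimator with Gaussian kernels, HSIC = tr(KHLH)/n^2, which the summary describes as efficiently computable. This sketch is not taken from the letter itself: the function names, the toy data, and the fixed kernel widths are illustrative assumptions. The widths sigma_x and sigma_y are precisely the kernel parameters that the letter proposes to select objectively via LSMI cross-validation instead of heuristically.

```python
import numpy as np

def gaussian_gram(z, sigma):
    """Gaussian kernel Gram matrix: K_ij = exp(-||z_i - z_j||^2 / (2 sigma^2))."""
    sq_norms = np.sum(z ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * z @ z.T
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def hsic(x, y, sigma_x, sigma_y):
    """Biased empirical HSIC estimate between paired samples x and y."""
    n = x.shape[0]
    K = gaussian_gram(x, sigma_x)          # Gram matrix on x-samples
    L = gaussian_gram(y, sigma_y)          # Gram matrix on y-samples
    H = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2

# Illustrative usage on synthetic dependent data; the sigma values below are
# placeholders, not the cross-validated choices advocated in the letter.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y = x + 0.5 * rng.normal(size=(200, 1))
print(hsic(x, y, sigma_x=1.0, sigma_y=1.0))
```

A larger HSIC value indicates stronger statistical dependence between x and y; the letter's contribution is to justify choosing sigma_x and sigma_y by the cross-validation procedure available for LSMI, rather than by ad hoc rules.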