

Computationally Efficient Estimation of Squared-Loss Mutual Information with Multiplicative Kernel Models
Tomoya SAKAI, Masashi SUGIYAMA
Publication
IEICE TRANSACTIONS on Information and Systems
Vol.E97-D
No.4
pp.968-971 Publication Date: 2014/04/01 Online ISSN: 1745-1361
DOI: 10.1587/transinf.E97.D.968 Type of Manuscript: LETTER Category: Fundamentals of Information Systems Keyword: squared-loss mutual information, least-squares mutual information, density ratio estimation, multiplicative kernel models, independence test
Summary:
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence between random variables. The sample-based SMI approximator called least-squares mutual information (LSMI) has been demonstrated to be useful in various machine learning tasks such as dimension reduction, clustering, and causal inference. The original LSMI approximates the pointwise mutual information by a kernel model, i.e., a linear combination of kernel basis functions located on paired data samples. Although LSMI was proved to achieve the optimal approximation accuracy asymptotically, its approximation capability is limited when the sample size is small, because the number of kernel basis functions is then insufficient. Increasing the number of kernel basis functions can mitigate this weakness, but a naive implementation of this idea significantly increases the computation cost. In this article, we consider the multiplicative kernel model, which locates kernel basis functions on all unpaired combinations of data samples so that the number of basis functions grows to the square of the sample size, and show that the computational complexity of LSMI with this model is the same as with the plain kernel model. We experimentally demonstrate that LSMI with the multiplicative kernel model is more accurate than LSMI with plain kernel models in small-sample cases, with only a mild increase in computation time.
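To illustrate why the multiplicative kernel model need not be quadratically more expensive, here is a minimal NumPy sketch of the kind of factorization the abstract alludes to. All names, kernel choices, and the regularization setup are assumptions for illustration, not the authors' exact formulation: with basis functions of the form k(x, x_l)·k(y, y_l'), the n²×n² cross-moment matrix factors as a Kronecker product of two n×n matrices, so the regularized system can be solved as a Sylvester-type equation with two n×n eigendecompositions.

```python
import numpy as np

# Toy dependent data (sketch only; sigma and lambda would normally be
# chosen by cross-validation, which is omitted here).
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=(n, 1))
y = x + 0.1 * rng.normal(size=(n, 1))

def gauss_gram(z, sigma=1.0):
    """Gaussian kernel Gram matrix: G[i, l] = exp(-||z_i - z_l||^2 / (2 sigma^2))."""
    d2 = ((z[:, None, :] - z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

Kx = gauss_gram(x)   # K[i, l] = k(x_i, x_l)
Ky = gauss_gram(y)

lam = 1e-3
# With multiplicative basis functions phi_{l,l'}(x, y) = k(x, x_l) k(y, y_l'),
# the n^2 x n^2 cross-moment matrix is the Kronecker product A (x) B of:
A = Kx.T @ Kx / n
B = Ky.T @ Ky / n
# and the n^2-dim linear term reshapes into an n x n matrix Hm[l, l']:
Hm = Kx.T @ Ky / n

# Solving (A (x) B + lam I) theta = h is equivalent to the Sylvester-type
# system  A @ Theta @ B + lam * Theta = Hm, solvable with two n x n
# eigendecompositions -- the same O(n^3) order as the plain kernel model.
da, Ua = np.linalg.eigh(A)
db, Ub = np.linalg.eigh(B)
C = (Ua.T @ Hm @ Ub) / (np.outer(da, db) + lam)
Theta = Ua @ C @ Ub.T     # n^2 coefficients, stored as an n x n matrix

# A plug-in SMI estimate of the usual LSMI form (0.5 h^T theta - 0.5).
smi_hat = 0.5 * np.sum(Hm * Theta) - 0.5
```

The point of the sketch is the solve step: a naive treatment of the n² basis functions would invert an n²×n² matrix, whereas the Kronecker structure reduces it to eigendecompositions of two n×n matrices.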

