
Fast Iterative Mining Using Sparsity-Inducing Loss Functions
Hiroto SAIGO, Hisashi KASHIMA, Koji TSUDA
Publication
IEICE TRANSACTIONS on Information and Systems
Vol.E96-D
No.8
pp.1766-1773
Publication Date: 2013/08/01
Online ISSN: 1745-1361
DOI: 10.1587/transinf.E96.D.1766
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Pattern Recognition
Keyword: discriminative pattern mining, sparsity, support vectors, classification, regression
Summary:
Apriori-based mining algorithms enumerate frequent patterns efficiently, but the resulting large number of patterns makes it difficult to apply subsequent learning tasks directly. Recently, efficient iterative methods have been proposed for mining discriminative patterns for classification and regression. These methods iteratively execute a discriminative pattern mining algorithm and update example weights to emphasize the examples that received large errors in the previous iteration. In this paper, we study a family of loss functions that induces sparsity on example weights. Most of the resulting example weights become zero, so we can eliminate those examples from discriminative pattern mining, leading to a significant decrease in search space and time. In computational experiments, we compare and evaluate various loss functions in terms of the amount of sparsity induced and the resulting speedup obtained.
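To illustrate the sparsity mechanism the summary describes, consider a hinge-type loss: the per-example weight passed to the next mining iteration is the loss (sub)gradient, which is exactly zero for examples that already satisfy the margin. This is only a minimal sketch of the general idea, not the authors' exact algorithm; the function name and the toy data are illustrative.

```python
import numpy as np

def hinge_example_weights(y, f):
    """Example weights w_i = -dL/df_i for the hinge loss
    L_i = max(0, 1 - y_i * f_i).

    y : array of labels in {-1, +1}
    f : array of current model predictions

    The weight is zero wherever the margin y_i * f_i >= 1, so those
    examples can be dropped from the next pattern-mining pass.
    """
    margin = y * f
    return np.where(margin < 1.0, y, 0.0)

# Toy data: the first two examples are fit well, the last two are not.
y = np.array([+1.0, -1.0, +1.0, -1.0])
f = np.array([1.5, -2.0, 0.3, 0.4])
w = hinge_example_weights(y, f)

# Only margin-violating examples keep nonzero weight; the rest are
# eliminated, shrinking the search space for the next mining call.
active = np.nonzero(w)[0]
```

Smoother losses (e.g. squared error) assign small but nonzero weights to nearly all examples, which is why the choice of loss directly controls how much of the dataset each mining iteration must scan.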

