

Fast Local Algorithms for Large Scale Nonnegative Matrix and Tensor Factorizations
Andrzej CICHOCKI, Anh-Huy PHAN
Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences
Vol. E92-A
No. 3
pp. 708-721
Publication Date: 2009/03/01
Online ISSN: 1745-1337
DOI: 10.1587/transfun.E92.A.708
Print ISSN: 0916-8508
Type of Manuscript: INVITED PAPER (Special Section on Latest Advances in Fundamental Theories of Signal Processing)
Category:
Keywords: nonnegative matrix factorization (NMF), nonnegative tensor factorization (NTF), nonnegative PARAFAC, model reduction, feature extraction, compression, denoising, multiplicative local learning (adaptive) algorithms, alpha and beta divergences
Summary:
Nonnegative matrix factorization (NMF) and its extensions, such as nonnegative tensor factorization (NTF), have become prominent techniques for blind source separation (BSS), analysis of image databases, data mining, and other information retrieval and clustering applications. In this paper we propose a family of efficient algorithms for NMF/NTF, as well as for sparse nonnegative coding and representation, that have many potential applications in computational neuroscience, multi-sensory processing, compressed sensing, and multidimensional data analysis. We have developed a class of optimized local algorithms, referred to as Hierarchical Alternating Least Squares (HALS) algorithms. For this purpose, we perform sequential constrained minimization on a set of squared Euclidean distances. We then extend this approach to robust cost functions using the alpha and beta divergences and derive flexible update rules. Our algorithms are locally stable and work well for NMF-based blind source separation (BSS), not only in the overdetermined case but also in the underdetermined (overcomplete) case (i.e., a system with fewer sensors than sources), provided the data are sufficiently sparse. The NMF learning rules are extended and generalized to Nth-order nonnegative tensor factorization (NTF). Moreover, these algorithms can be tuned to different noise statistics by adjusting a single parameter. Extensive experimental results confirm the accuracy and computational performance of the developed algorithms, especially with the multilayer hierarchical NMF approach [3].
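The HALS idea described in the abstract — sequential constrained minimization of squared Euclidean distances, updating one component at a time — can be illustrated with a minimal sketch for plain NMF (X ≈ WH, W, H ≥ 0). This is a generic HALS implementation under the standard Frobenius-norm cost, not a reproduction of the paper's specific update rules (which also cover alpha/beta divergences and tensor factorizations); the function name and parameters are illustrative.

```python
import numpy as np

def hals_nmf(X, rank, n_iter=200, eps=1e-12):
    """Minimal HALS sketch for NMF: X ~= W @ H with W, H >= 0.

    Instead of solving for all of W or H at once, HALS cycles through
    the components, updating one row of H (or column of W) with a
    closed-form nonnegative least-squares step while the others are fixed.
    """
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Local updates for the rows of H (components updated sequentially).
        WtX = W.T @ X
        WtW = W.T @ W
        for j in range(rank):
            H[j] = np.maximum(eps, H[j] + (WtX[j] - WtW[j] @ H) / WtW[j, j])
        # Symmetric local updates for the columns of W.
        XHt = X @ H.T
        HHt = H @ H.T
        for j in range(rank):
            W[:, j] = np.maximum(eps, W[:, j] + (XHt[:, j] - W @ HHt[:, j]) / HHt[j, j])
    return W, H
```

Each inner update is cheap (a rank-one correction clipped at zero), which is what makes these "local" algorithms fast on large-scale data compared with multiplicative or full ALS updates.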

