Hybrid Parallel Inference for Hierarchical Dirichlet Processes

Tsukasa OMOTO  Koji EGUCHI  Shotaro TORA  

IEICE TRANSACTIONS on Information and Systems   Vol.E97-D   No.4   pp.815-820
Publication Date: 2014/04/01
Online ISSN: 1745-1361
DOI: 10.1587/transinf.E97.D.815
Type of Manuscript: Special Section LETTER (Special Section on Data Engineering and Information Management)
Keywords: hierarchical Dirichlet process, topic models, parallelization


The hierarchical Dirichlet process (HDP) provides a nonparametric prior for a mixture model over grouped data, in which mixture components are shared across groups. However, its computational cost is generally very high in both time and space, so fast HDP inference remains a challenge. In this paper, we assume a symmetric multiprocessing (SMP) cluster, an architecture that has been widely used in recent years. To speed up inference on an SMP cluster, we explore hybrid two-level parallelization of the Chinese restaurant franchise sampling scheme for the HDP, focusing in particular on its application to topic modeling. The methods we developed, Hybrid-AD-HDP and Hybrid-Diff-AD-HDP, make better use of SMP clusters and thus achieve faster HDP inference. Whereas conventional parallel algorithms based purely on the message-passing interface do not benefit from SMP clusters because of their higher communication costs, the proposed hybrid parallel algorithms have lower communication costs and make better use of the available computational resources.
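To illustrate the two-level idea described in the abstract, the following is a minimal, hypothetical Python sketch (not the paper's actual implementation): each "thread" samples on its own document shard against a stale copy of the global counts, a per-node merge combines thread-level deltas in shared memory, and only one aggregated delta per node would need to cross the network, which is the communication saving over a flat all-MPI design. The shard layout, the fake "Gibbs draw" (a deterministic bucket in place of a real topic assignment), and all function names are illustrative assumptions.

```python
from collections import Counter

def local_sample(docs, global_counts):
    # One "thread": sample topics for its shard against a stale snapshot of
    # global_counts, accumulating only a local count delta. Here the Gibbs
    # draw is replaced by a deterministic stand-in bucket (illustrative only).
    delta = Counter()
    for doc in docs:
        for word in doc:
            topic = len(word) % 4  # stand-in for a Gibbs topic draw
            delta[topic] += 1
    return delta

def hybrid_step(shards_per_node, global_counts):
    # Level 1 (shared memory): threads within a node merge their deltas
    # locally, so intra-node combination costs no network traffic.
    node_deltas = []
    for node_shards in shards_per_node:
        node_delta = Counter()
        for shard in node_shards:  # level 2: one shard per "thread"
            node_delta.update(local_sample(shard, global_counts))
        node_deltas.append(node_delta)
    # Level 2 (message passing): only one small delta per node is
    # exchanged and folded into the global topic counts.
    for d in node_deltas:
        global_counts.update(d)
    return global_counts
```

In a real hybrid implementation the outer loop over nodes would be MPI ranks and the inner loop shared-memory threads; this sketch only shows the count-aggregation structure that reduces the number of messages from one per worker to one per node.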