GPGPU Implementation of Variational Bayesian Gaussian Mixture Models

Renyuan ZHANG

IEICE TRANSACTIONS on Information and Systems   Vol.E105-D    No.3    pp.611-622
Publication Date: 2022/03/01
Publicized: 2021/11/24
Online ISSN: 1745-1361
DOI: 10.1587/transinf.2021EDP7121
Type of Manuscript: PAPER
Category: Fundamentals of Information Systems
Keywords: GPGPU, Gaussian mixture model, variational Bayes, machine learning, clustering


An efficient implementation strategy for accelerating high-quality clustering algorithms is developed in this work on the basis of general-purpose graphics processing units (GPGPUs). Among various clustering algorithms, a sophisticated Gaussian mixture model (GMM), whose parameters are estimated through the variational Bayesian (VB) mechanism, is adopted for its superior performance. Since the VB-GMM methodology is computationally intensive, the GPGPU is employed to carry out the massive matrix computations. To efficiently migrate the conventional CPU-oriented VB-GMM scheme onto GPGPU platforms, a complete migration flow with thirteen stages is presented in detail. A CPU-GPGPU co-operation scheme, execution re-ordering, and memory-access optimization are proposed to optimize GPGPU utilization and maximize clustering speed. Five types of real-world applications, along with the relevant data sets, are introduced for cross-validation. The experimental results verify the feasibility and practical benefits of implementing the VB-GMM algorithm on a GPGPU: the proposed migration achieves a maximum speedup of 192x. Furthermore, it succeeds in identifying the proper number of clusters, which the EM algorithm can hardly accomplish.
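As a minimal CPU-side sketch (not the paper's GPGPU implementation), the cluster-count identification property attributed to VB-GMM above can be illustrated with scikit-learn's `BayesianGaussianMixture`: the model is deliberately given more components than true clusters, and the variational inference with a sparse weight-concentration prior drives the surplus components' mixture weights toward zero.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated 2-D Gaussian clusters, 200 points each.
X = np.vstack([
    rng.normal(loc=center, scale=0.3, size=(200, 2))
    for center in ([0.0, 0.0], [5.0, 5.0], [0.0, 5.0])
])

# Fit with far more components (10) than true clusters (3); the VB
# mechanism prunes surplus components by shrinking their weights,
# unlike plain EM, which would use all 10 components.
vbgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=1e-2,  # sparse prior encourages pruning
    max_iter=500,
    random_state=0,
).fit(X)

labels = vbgmm.predict(X)
# Count components that retained non-negligible mixture weight.
effective = int(np.sum(vbgmm.weights_ > 1e-2))
print("effective components:", effective)
```

On data this well separated the effective component count settles near the true number of clusters, which is the behavior the abstract contrasts with the EM algorithm; the weight threshold `1e-2` is an illustrative choice, not a value from the paper.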
