Recognition of Alphabetical Hand Gestures Using Hidden Markov Model

Ho-Sub YOON, Jung SOH, Byung-Woo MIN, Hyun Seung YANG

IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E82-A   No.7   pp.1358-1366
Publication Date: 1999/07/25
Print ISSN: 0916-8508
Type of Manuscript: PAPER
Category: Neural Networks
Keywords: gesture recognition, hidden Markov model (HMM)


The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). In particular, visual interpretation of hand gestures can help achieve easy and natural interaction in HCI. Many methods for hand gesture recognition using visual analysis have been proposed, such as syntactic analysis, neural networks (NNs), and hidden Markov models (HMMs). In our research, HMMs are proposed for alphabetical hand gesture recognition. The preprocessing stage consists of three procedures: hand localization, hand tracking, and gesture spotting. The hand localization procedure detects candidate hand regions on the basis of skin color and motion in an image, using color histogram matching and time-varying edge difference techniques. The hand tracking algorithm finds the centroid of the moving hand region in each frame, connects those centroids, and produces a trajectory. The spotting algorithm divides the trajectory into real and meaningless gestures. In constructing the feature database, the proposed approach uses a weighted ρ-φ-ν feature code and employs the k-means algorithm to build the HMM codebook. In our experiments, 1,300 alphabetical gestures are used for training and 1,300 untrained gestures for testing. The experimental results demonstrate that the proposed approach yields a high and satisfactory recognition rate for images with different sizes, shapes, and skew angles.
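The recognition stage described above, quantizing trajectory features to codebook symbols and scoring the symbol sequence with a discrete HMM, might be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the two-component direction/magnitude feature stands in for the weighted ρ-φ-ν code, the function names are invented, and the scaled forward recursion is a standard textbook formulation.

```python
import numpy as np

def trajectory_codes(points, codebook):
    """Quantize successive motion vectors of a hand-centroid trajectory
    to the nearest codebook symbol.

    Illustrative stand-in for the paper's weighted rho-phi-nu feature
    code: each step is summarized by its direction angle and magnitude,
    and matched to a codebook (e.g. learned by k-means) by Euclidean
    distance. (Using raw angles here ignores wraparound at +/-pi, a
    simplification for this sketch.)
    """
    vecs = np.diff(np.asarray(points, dtype=float), axis=0)
    feats = np.stack([np.arctan2(vecs[:, 1], vecs[:, 0]),   # direction
                      np.linalg.norm(vecs, axis=1)], axis=1)  # magnitude
    dists = np.linalg.norm(feats[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm for a discrete HMM.

    obs: sequence of codebook symbol indices
    pi:  initial state probabilities, shape (n_states,)
    A:   transition matrix, shape (n_states, n_states)
    B:   emission matrix, shape (n_states, n_symbols)
    Returns log P(obs | model); a gesture would be classified by taking
    the argmax of this score over the per-letter HMMs.
    """
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # predict, then weight by emission
        s = alpha.sum()
        log_lik += np.log(s)            # accumulate the scaling factors
        alpha = alpha / s
    return log_lik
```

A gesture trajectory would then be converted to a symbol sequence once, scored against every letter's trained HMM, and assigned to the letter whose model gives the highest log-likelihood.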