A Recognition Method for One-Stroke Finger Gestures Using a MEMS 3D Accelerometer

Lei JING, Yinghui ZHOU, Zixue CHENG, Junbo WANG

IEICE TRANSACTIONS on Information and Systems   Vol.E94-D   No.5   pp.1062-1072
Publication Date: 2011/05/01
Online ISSN: 1745-1361
DOI: 10.1587/transinf.E94.D.1062
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Rehabilitation Engineering and Assistive Technology
Keywords: ubiquitous computing, wearable computing, gesture recognition, finger gesture recognition, accelerometer, homecare


Automatic recognition of finger gestures can improve quality of life. For example, a senior citizen can control home appliances, call for help in an emergency, or even communicate with others through simple finger gestures. Here, we focus on one-stroke finger gestures, which are intuitive to remember and perform. In this paper, we propose and evaluate an accelerometer-based method for detecting predefined one-stroke finger gestures from data collected with a MEMS 3D accelerometer worn on the index finger. As an alternative to optoelectronic, sonic, and ultrasonic approaches, the accelerometer-based method is self-contained, cost-effective, and usable in noisy or private spaces. A compact wireless sensing mote integrating the accelerometer, called MagicRing, was developed to be worn on the finger for real data collection. A general definition of one-stroke gestures is given, and 12 kinds of one-stroke finger gestures are selected from human daily activities. A set of features is selected from a candidate set that includes both traditional features, such as the standard deviation, energy, entropy, and frequency of acceleration, and a new type of feature called the relative feature. Both subject-independent and subject-dependent experiments were conducted with three representative classifiers. In the subject-independent experiment among 20 subjects, the decision tree classifier showed the best performance, recognizing the finger gestures with an average accuracy of 86.92%. In the subject-dependent experiment, the nearest neighbor classifier achieved the highest accuracy of 97.55%.
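To illustrate the kind of feature extraction the abstract describes, the sketch below computes per-axis standard deviation, spectral energy, and spectral entropy from a window of 3-axis accelerometer samples. This is an illustrative approximation only: the paper's exact feature definitions (and its "relative feature") are not reproduced here, and the function and window size are hypothetical.

```python
import numpy as np

def extract_features(window):
    """Per-axis features for one gesture window of shape (n_samples, 3).

    Feature names follow the abstract (std, energy, entropy); the exact
    definitions in the paper may differ -- this is a sketch, not the
    authors' implementation.
    """
    feats = {}
    for axis, name in enumerate("xyz"):
        a = window[:, axis]
        # Standard deviation of raw acceleration on this axis
        feats[f"std_{name}"] = float(np.std(a))
        # Spectral energy: mean squared FFT magnitude (DC bin excluded)
        spec = np.abs(np.fft.rfft(a))[1:]
        feats[f"energy_{name}"] = float(np.mean(spec ** 2))
        # Spectral entropy over the normalized magnitude spectrum
        total = spec.sum()
        p = spec / total if total > 0 else np.full(spec.size, 1.0 / spec.size)
        feats[f"entropy_{name}"] = float(-np.sum(p * np.log2(p + 1e-12)))
    return feats

# Example: a synthetic 50-sample window of 3-axis acceleration
rng = np.random.default_rng(0)
window = rng.normal(size=(50, 3))
features = extract_features(window)
```

Such feature vectors would then be fed to a classifier (e.g., a decision tree or nearest neighbor model) for gesture recognition.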