Real-Time Uncharacteristic-Part Tracking with a Point Set

Norimichi UKITA, Akira MAKINO, Masatsugu KIDODE

Publication
IEICE TRANSACTIONS on Information and Systems, Vol.E93-D, No.7, pp.1682-1689
Publication Date: 2010/07/01
Online ISSN: 1745-1361
Print ISSN: 0916-8532
Type of Manuscript: Special Section PAPER (Special Section on Machine Vision and its Applications)
Keyword: real-time tracking, zoom-in camera, point-set tracking

Summary: 
In this research, we focus on how to track a target region that lies next to similar regions (e.g., a forearm next to an upper arm) in zoom-in images. Many previous tracking methods express the target region (i.e., a part of a human body) with a single model such as an ellipse, a rectangle, or a deformable closed region. With such a single model, however, it is difficult to track the target region in zoom-in images without confusing it with its neighboring similar regions (e.g., "a forearm and an upper arm" or "a small region in a torso and its neighboring regions"), because they may share the same texture patterns and have no detectable border between them. In our method, a group of feature points in the target region is extracted and tracked as the model of the target. Small differences between neighboring regions can be distinguished by focusing only on these feature points. In addition, (1) the stability of tracking is improved using particle filtering, and (2) tracking robust to occlusions is realized by removing unreliable points using random sampling. Experimental results demonstrate the effectiveness of our method even when occlusions occur.
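
Below is a minimal illustrative sketch of the kind of point-set tracking the summary describes: a particle filter over candidate motions of the point set, combined with a random-sampling (RANSAC-style) step that removes unreliable points. It is not the authors' implementation; the translation-only motion model, the assumption of known one-to-one point correspondences, and all function names and parameter values are assumptions made for this example.

```python
# Illustrative sketch (not the authors' code): point-set tracking with a
# particle filter over 2-D translations, plus a random-sampling step that
# drops unreliable points (e.g., points lost to occlusion).
# The translation-only motion model and all parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, motion_std=2.0):
    """Diffuse each particle (a candidate 2-D translation of the point set)."""
    return particles + rng.normal(scale=motion_std, size=particles.shape)

def weights(particles, model_pts, observed_pts):
    """Weight each particle by how well the translated model matches the observation."""
    w = np.empty(len(particles))
    for i, t in enumerate(particles):
        shifted = model_pts + t
        # nearest-neighbour residual of every model point against the observed set
        d = np.linalg.norm(shifted[:, None, :] - observed_pts[None, :, :], axis=2)
        w[i] = np.exp(-d.min(axis=1).mean() / 5.0)
    return w / w.sum()

def resample(particles, w):
    """Standard multinomial resampling."""
    return particles[rng.choice(len(particles), size=len(particles), p=w)]

def drop_unreliable(model_pts, observed_pts, trials=50, tol=3.0):
    """Random sampling over single correspondences; keep the largest consensus set.
    Points that disagree with the consensus translation (e.g., occluded or
    mis-tracked points) are removed from the model."""
    best = np.zeros(len(model_pts), dtype=bool)
    for _ in range(trials):
        k = rng.integers(len(model_pts))
        t = observed_pts[k] - model_pts[k]          # translation from one correspondence
        inliers = np.linalg.norm(model_pts + t - observed_pts, axis=1) < tol
        if inliers.sum() > best.sum():
            best = inliers
    return model_pts[best], observed_pts[best]

# Toy usage: recover a small translation of a synthetic point set.
model = rng.uniform(0, 100, size=(30, 2))
observed = model + np.array([4.0, -2.0]) + rng.normal(scale=0.5, size=model.shape)
model, observed = drop_unreliable(model, observed)
particles = rng.normal(scale=5.0, size=(200, 2))
for _ in range(10):
    particles = predict(particles)
    particles = resample(particles, weights(particles, model, observed))
print("estimated translation:", particles.mean(axis=0))
```

In a real tracker the point correspondences would come from a feature tracker on the zoom-in images, and the particle state could include rotation and scale in addition to translation; the sketch only mirrors the two ideas named in the summary, particle filtering for stability and random-sampling removal of unreliable points.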