Content-Based Retrieval of Motion Capture Data Using Short-Term Feature Extraction
Jianfeng XU Haruhisa KATO Akio YONEYAMA
Publication: IEICE TRANSACTIONS on Information and Systems, Vol.E92-D, No.9, pp.1657-1667
Publication Date: 2009/09/01
Online ISSN: 1745-1361
Print ISSN: 0916-8532
DOI: 10.1587/transinf.E92.D.1657
Type of Manuscript: PAPER
Category: Contents Technology and Web Information Systems
Keywords: motion capture, content-based retrieval, short-term feature, distance metric, motion similarity, dynamic time warping
Summary:
This paper presents a content-based retrieval algorithm for motion capture data, which is needed to re-use large-scale databases containing many variations within the same motion category. The most challenging problem is that logically similar motions may not be numerically similar because of these variations. Our algorithm effectively retrieves motions logically similar to a query; its fundamental component is a properly defined distance metric between our novel short-term features. We extract the features by short-term analysis of joint velocities after dividing an entire motion capture sequence into many small overlapped clips. In each clip, we select not only the magnitude but also the dynamic pattern of the joint velocities as features, which discards the motion variations while keeping the significant motion information of a category. At the same time, the amount of data is reduced, alleviating the computational cost. Using the extracted features, we define a novel distance metric between two motion clips, and dynamic time warping then yields a dissimilarity measure between two motion capture sequences. Given a query, we rank all the motions in our dataset by this dissimilarity measure. Experiments on a test dataset of more than 190 motions demonstrate that our algorithm greatly outperforms two conventional methods according to the popular evaluation measure P(NR).
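The pipeline described in the summary (overlapped clips, velocity-based clip features, a clip distance, and DTW over the clip sequence) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function names, the clip length and hop, and the choice of mean joint speed as the magnitude feature are all assumptions, and the paper's dynamic-pattern component of the feature is omitted for brevity.

```python
import numpy as np

def short_term_features(positions, clip_len=8, hop=4):
    """Divide a motion sequence into overlapped clips and extract a
    per-clip feature from joint velocities.

    positions: (T, J, 3) array of J joint positions over T frames.
    Returns a (num_clips, J) array of mean joint speeds per clip
    (magnitude feature only; clip_len/hop are illustrative values).
    """
    velocities = np.diff(positions, axis=0)       # (T-1, J, 3) frame-to-frame
    speeds = np.linalg.norm(velocities, axis=2)   # (T-1, J) per-joint speed
    clips = []
    for start in range(0, speeds.shape[0] - clip_len + 1, hop):
        clips.append(speeds[start:start + clip_len].mean(axis=0))
    return np.array(clips)

def clip_distance(f1, f2):
    """Distance between two clip feature vectors (Euclidean here)."""
    return np.linalg.norm(f1 - f2)

def dtw_dissimilarity(feats_a, feats_b):
    """Dynamic time warping over the two clip-feature sequences,
    returning a length-normalized dissimilarity measure."""
    n, m = len(feats_a), len(feats_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = clip_distance(feats_a[i - 1], feats_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m] / (n + m)
```

Retrieval then amounts to computing `dtw_dissimilarity` between the query's features and every sequence in the dataset and sorting the results in ascending order.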