Regularized Kernel Representation for Visual Tracking

Jun WANG  Yuanyun WANG  Chengzhi DENG  Shengqian WANG  Yong QIN  

IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E101-A   No.4   pp.668-677
Publication Date: 2018/04/01
Online ISSN: 1745-1337
DOI: 10.1587/transfun.E101.A.668
Type of Manuscript: PAPER
Category: Digital Signal Processing
Keywords: visual tracking, kernel representation, l2-regularization, nonlinear, particle filter


Developing a robust appearance model is a challenging task due to appearance variations of objects, such as partial occlusion, illumination variation, rotation and background clutter. Existing tracking algorithms represent target appearances by linear combinations of target templates, which are not accurate enough to handle such appearance variations. Because of these complicated variations, the underlying relationship between target candidates and target templates is highly nonlinear. To address this, this paper presents a regularized kernel representation for visual tracking. The feature vectors of target appearances are mapped into a higher-dimensional feature space, in which a target candidate is approximately represented by a nonlinear combination of target templates. The kernel-based appearance model thus captures the nonlinear relationship and the nonlinear similarity between target candidates and target templates. An l2-regularization on the coding coefficients makes the approximate solution of target representations more stable. Comprehensive experiments demonstrate the superior performance of the proposed tracker in comparison with state-of-the-art trackers.
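The core idea in the abstract can be sketched in a few lines. The snippet below is an illustrative assumption, not the paper's exact formulation: it uses a Gaussian (RBF) kernel and the standard closed-form solution of l2-regularized coding in feature space, c = (K_TT + λI)⁻¹ k_Ty, where K_TT is the template Gram matrix and k_Ty holds the kernel similarities between the templates and a candidate. The feature-space reconstruction residual can then score candidates, e.g. inside a particle filter.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise RBF kernel between the columns of X (d x n) and Y (d x m).
    d2 = (np.sum(X**2, axis=0)[:, None]
          + np.sum(Y**2, axis=0)[None, :]
          - 2.0 * X.T @ Y)
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_l2_coding(T, y, lam=0.01, sigma=1.0):
    """Code a candidate y (d-vector) over templates T (d x n) in feature space.

    Solves  min_c ||phi(y) - Phi(T) c||^2 + lam * ||c||^2,
    whose closed form is  c = (K_TT + lam * I)^{-1} k_Ty.
    Returns (c, residual), where residual is the squared reconstruction
    error of phi(y) in the kernel-induced feature space.
    """
    n = T.shape[1]
    K_TT = gaussian_kernel(T, T, sigma)                 # n x n Gram matrix
    k_Ty = gaussian_kernel(T, y[:, None], sigma)[:, 0]  # n-vector
    c = np.linalg.solve(K_TT + lam * np.eye(n), k_Ty)
    k_yy = 1.0  # for the Gaussian kernel, k(y, y) = 1
    residual = k_yy - 2.0 * c @ k_Ty + c @ K_TT @ c
    return c, residual
```

In a particle-filter tracker, each candidate patch drawn by the motion model would be coded this way, with its observation likelihood taken as something like exp(-residual), so candidates that are well represented by a nonlinear combination of the templates score highest.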