Real-Time Generic Object Tracking via Recurrent Regression Network
Rui CHEN, Ying TONG, Ruiyu LIANG
IEICE TRANSACTIONS on Information and Systems
Publication Date: 2020/03/01
Online ISSN: 1745-1361
Type of Manuscript: PAPER
Category: Artificial Intelligence, Data Mining
Keywords: object tracking, deep learning, convolutional neural network, features comparison, recurrent neural network
Deep neural networks have achieved great success in visual tracking by learning generic representations and leveraging large amounts of training data to improve performance. However, most generic object trackers are trained from scratch online and do not benefit from the large number of videos available for offline training. We present a real-time generic object tracker that incorporates temporal information into its model, learns from many examples offline, and updates quickly online. During training, the pre-trained convolutional-layer weights are updated with a lag, and the length of the input video sequences is gradually increased for fast convergence. Furthermore, only the hidden states of the recurrent network are updated online, which guarantees real-time tracking speed. Experimental results show that the proposed method tracks objects at 150 fps with a higher predicted overlap rate, and is more robust on multiple benchmarks than state-of-the-art trackers.
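The key speed trick described above, updating only the recurrent hidden state online while keeping all learned weights frozen, can be sketched as follows. This is a hypothetical minimal illustration, not the authors' implementation: the weight matrices stand in for the pre-trained convolutional and recurrent layers, and the 4-dimensional output stands in for a regressed bounding box.

```python
import numpy as np

# Hypothetical sketch of online tracking with a recurrent regression
# network: all weights are frozen after offline training; only the
# hidden state h is carried and updated frame-to-frame.
rng = np.random.default_rng(0)

W_x = rng.standard_normal((16, 8)) * 0.1   # stand-in: conv features -> hidden
W_h = rng.standard_normal((16, 16)) * 0.1  # stand-in: hidden -> hidden
W_o = rng.standard_normal((4, 16)) * 0.1   # stand-in: hidden -> box (x, y, w, h)

def track_step(h, feat):
    """One frame: update only the hidden state, then regress a box."""
    h = np.tanh(W_x @ feat + W_h @ h)      # the only quantity updated online
    box = W_o @ h                          # regressed bounding-box parameters
    return h, box

h = np.zeros(16)                           # hidden state at sequence start
for t in range(5):                         # five synthetic frames
    feat = rng.standard_normal(8)          # stand-in for per-frame conv features
    h, box = track_step(h, feat)

print(box.shape)  # -> (4,)
```

Because no gradients or weight updates are computed per frame, each step is a handful of matrix-vector products, which is what makes frame rates on the order of 150 fps plausible.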