Deep Relative Tracking
Gao, Junyu 1,2; Zhang, Tianzhu 1,2; Yang, Xiaoshan 1,2; Xu, Changsheng 1,2
Source Publication: IEEE TRANSACTIONS ON IMAGE PROCESSING
Date Issued: 2017-04-01
Volume: 26  Issue: 4  Pages: 1845-1858
Subtype: Article
Abstract: Most existing tracking methods are direct trackers, which directly exploit foreground and/or background information for object appearance modeling and decide whether an image patch is the target object or not. As a result, these trackers cannot perform well when the target appearance changes heavily and becomes different from its model. To deal with this issue, we propose a novel relative tracker, which can effectively exploit the relative relationship among image patches from both foreground and background for object appearance modeling. Different from direct trackers, the proposed relative tracker robustly localizes the target object by using the image patch with the highest relative score with respect to the target appearance model. To model the relative relationship among large-scale image patch pairs, we propose a novel and effective deep relative learning algorithm based on a convolutional neural network. We test the proposed approach on challenging sequences involving heavy occlusion, drastic illumination changes, and large pose variations. Experimental results show that our method consistently outperforms state-of-the-art trackers due to the powerful capacity of the proposed deep relative model.
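The following minimal sketch (Python with PyTorch) illustrates the general relative-scoring idea described in the abstract: a shared convolutional backbone embeds the target exemplar and candidate patches, a small head scores each (target, candidate) pair, and tracking picks the candidate with the highest relative score. The RelativeScorer class, layer sizes, and patch shapes are illustrative assumptions for exposition, not the network architecture or training procedure from the paper.

# Hypothetical sketch of relative patch scoring (not the paper's exact model).
import torch
import torch.nn as nn

class RelativeScorer(nn.Module):
    """Scores how well each candidate patch matches the target appearance,
    relative to the other candidates, using a shared convolutional backbone."""
    def __init__(self):
        super().__init__()
        # Shared feature extractor applied to both the target exemplar and candidates.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head that maps a concatenated (target, candidate) feature pair to one score.
        self.head = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, target_patch, candidate_patches):
        # target_patch: (1, 3, H, W); candidate_patches: (N, 3, H, W)
        t = self.features(target_patch)                  # (1, 64)
        c = self.features(candidate_patches)             # (N, 64)
        pairs = torch.cat([t.expand_as(c), c], dim=1)    # (N, 128) pairwise features
        return self.head(pairs).squeeze(1)               # (N,) relative scores

# Tracking step: choose the candidate with the highest relative score.
scorer = RelativeScorer()
target = torch.randn(1, 3, 64, 64)       # exemplar patch from the first frame
candidates = torch.randn(20, 3, 64, 64)  # patches sampled around the previous location
best_idx = scorer(target, candidates).argmax().item()

In the paper the relative model is learned from large-scale image patch pairs; here the weights are random, so the snippet only demonstrates the data flow of pairwise relative scoring.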
Keywords: Visual Tracking; Deep Learning; Relative Model
WOS Headings: Science & Technology; Technology
DOI: 10.1109/TIP.2017.2656628
WOS Keywords: ROBUST VISUAL TRACKING; OBJECT TRACKING; BENCHMARK
Indexed By: SCI
Language: English
Funding Organization: National Natural Science Foundation of China (61225009, 61432019, 61572498, 61532009, 61572296); Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions (IDHT20140224)
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WOS ID: WOS:000398976000005
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/13645
Collection: National Laboratory of Pattern Recognition_Multimedia Computing and Graphics
Corresponding Author: Changsheng Xu
Affiliation: 1. National Lab of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
Recommended Citation
GB/T 7714: Gao, Junyu, Zhang, Tianzhu, Yang, Xiaoshan, et al. Deep Relative Tracking[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2017, 26(4): 1845-1858.
APA: Gao, Junyu, Zhang, Tianzhu, Yang, Xiaoshan, & Xu, Changsheng. (2017). Deep Relative Tracking. IEEE TRANSACTIONS ON IMAGE PROCESSING, 26(4), 1845-1858.
MLA: Gao, Junyu, et al. "Deep Relative Tracking". IEEE TRANSACTIONS ON IMAGE PROCESSING 26.4 (2017): 1845-1858.
Files in This Item:
File Name/Size | DocType | Version | Access | License
Deep Relative Tracking.pdf (5252 KB) | Journal Article | Author's Accepted Manuscript | Open Access | CC BY-NC-SA
Google Scholar
Similar articles in Google Scholar
[Gao, Junyu]'s Articles
[Zhang, Tianzhu]'s Articles
[Yang, Xiaoshan]'s Articles
Baidu academic
Similar articles in Baidu academic
[Gao, Junyu]'s Articles
[Zhang, Tianzhu]'s Articles
[Yang, Xiaoshan]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Gao, Junyu]'s Articles
[Zhang, Tianzhu]'s Articles
[Yang, Xiaoshan]'s Articles
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.