Robust Visual Tracking via Exclusive Context Modeling
Zhang, Tianzhu [1,2]; Ghanem, Bernard [1,3]; Liu, Si [4]; Xu, Changsheng [2]; Ahuja, Narendra [5]
Source Publication: IEEE TRANSACTIONS ON CYBERNETICS
Year: 2016
Volume: 46; Issue: 1; Pages: 51-63
Subtype: Article
Abstract: In this paper, we formulate particle filter-based object tracking as an exclusive sparse learning problem that exploits contextual information. To achieve this goal, we propose the context-aware exclusive sparse tracker (CEST) to model particle appearances as linear combinations of dictionary templates that are updated dynamically. Learning the representation of each particle is formulated as an exclusive sparse representation problem, where the overall dictionary is composed of multiple group dictionaries that can contain contextual information. With context, CEST is less prone to tracker drift. Interestingly, we show that the popular L-1 tracker [1] is a special case of our CEST formulation. The proposed learning problem is efficiently solved using an accelerated proximal gradient method that yields a sequence of closed form updates. To make the tracker much faster, we reduce the number of learning problems to be solved by using the dual problem to quickly and systematically rank and prune particles in each frame. We test our CEST tracker on challenging benchmark sequences that involve heavy occlusion, drastic illumination changes, and large pose variations. Experimental results show that CEST consistently outperforms state-of-the-art trackers.
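The abstract notes that the L-1 tracker is a special case of the CEST formulation, and that the learning problem is solved with an accelerated proximal gradient method with closed-form updates. As a rough illustration of that L-1 special case only (not the paper's full exclusive-group penalty or particle-pruning machinery), the sketch below runs FISTA-style accelerated proximal gradient on an l1-regularized least-squares representation of an observation against a template dictionary; all names (`fista_l1`, `soft_threshold`) and parameter choices are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_l1(D, y, lam=0.1, n_iter=200):
    """Accelerated proximal gradient (FISTA) for
        min_x 0.5 * ||D x - y||^2 + lam * ||x||_1,
    i.e. an l1 sparse-representation step: y is an observed particle patch,
    the columns of D are dictionary templates, x its sparse code."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(D.shape[1])
    z, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = D.T @ (D @ z - y)           # gradient of the least-squares term at z
        x_new = soft_threshold(z - grad / L, lam / L)   # closed-form proximal update
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x
```

In a tracking loop, one such problem would be solved per candidate particle, and the particle whose sparse code gives the smallest reconstruction error would be selected; the exclusive-group penalty in the paper replaces the plain l1 term while keeping the same proximal-gradient structure.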
Keywords: Contextual Information; Exclusive Sparse Learning; Particle Filter; Tracking
WOS Headings: Science & Technology; Technology
DOI: 10.1109/TCYB.2015.2393307
WOS Keywords: OBJECT TRACKING; OCCLUSION DETECTION
Indexed By: SCI
Language: English
Funding Organization: Advanced Digital Sciences Center, Singapore's Agency for Science, Technology and Research, under a Research Grant for the Human Sixth Sense Programme; National Program on Key Basic Research Project (973 Program) (2012CB316304); National Natural Science Foundation of China (61225009)
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Cybernetics
WOS ID: WOS:000367144300006
Citation Statistics: Cited Times (WOS): 34
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/10648
Collection: National Laboratory of Pattern Recognition_Multimedia Computing and Graphics
Affiliation:
1. Adv Digital Sci Ctr, Singapore 138632, Singapore
2. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
3. King Abdullah Univ Sci & Technol, Thuwal 23955-6900, Saudi Arabia
4. Chinese Acad Sci, Inst Informat Engn, Beijing 100190, Peoples R China
5. Univ Illinois, Beckman Inst, Dept Elect & Comp Engn, Coordinated Sci Lab, Urbana, IL 61801 USA
Recommended Citation
GB/T 7714: Zhang, Tianzhu, Ghanem, Bernard, Liu, Si, et al. Robust Visual Tracking via Exclusive Context Modeling[J]. IEEE TRANSACTIONS ON CYBERNETICS, 2016, 46(1): 51-63.
APA: Zhang, Tianzhu, Ghanem, Bernard, Liu, Si, Xu, Changsheng, & Ahuja, Narendra. (2016). Robust Visual Tracking via Exclusive Context Modeling. IEEE TRANSACTIONS ON CYBERNETICS, 46(1), 51-63.
MLA: Zhang, Tianzhu, et al. "Robust Visual Tracking via Exclusive Context Modeling". IEEE TRANSACTIONS ON CYBERNETICS 46.1 (2016): 51-63.
Files in This Item:
File: TCYB15_Exclusive Context Modeling.pdf (1451KB); DocType: Journal Article; Version: Author Accepted Manuscript; Access: Open Access; License: CC BY-NC-SA
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.