Deep Relative Tracking
Gao, Junyu 1,2; Zhang, Tianzhu 1,2; Yang, Xiaoshan 1,2; Xu, Changsheng 1,2
2017-04-01
Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume: 26, Issue: 4, Pages: 1845-1858
Article Type: Article
Abstract: Most existing tracking methods are direct trackers, which directly exploit foreground and/or background information to model object appearance and decide whether an image patch is the target object. As a result, these trackers cannot perform well when the target appearance changes heavily and diverges from its model. To deal with this issue, we propose a novel relative tracker, which effectively exploits the relative relationships among image patches from both the foreground and the background for object appearance modeling. Unlike direct trackers, the proposed relative tracker robustly localizes the target object by selecting the image patch with the highest relative score to the target appearance model. To model the relative relationships among large-scale image patch pairs, we propose a novel and effective deep relative learning algorithm based on a convolutional neural network. We test the proposed approach on challenging sequences involving heavy occlusion, drastic illumination changes, and large pose variations. Experimental results show that our method consistently outperforms state-of-the-art trackers thanks to the powerful capacity of the proposed deep relative model.
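The relative scoring idea in the abstract can be sketched in miniature: instead of classifying each candidate patch directly, rank the candidates by a pairwise score against the target appearance model and keep the highest-scoring one. The sketch below uses a simple dot-product margin as a stand-in for the paper's learned CNN score; the function names, templates, and toy feature vectors are all illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of relative patch selection (illustrative stand-in for
# the paper's deep relative model; the learned CNN score is replaced by
# a dot-product margin between foreground and background templates).

def dot(a, b):
    """Dot product of two equal-length feature vectors."""
    return sum(x * y for x, y in zip(a, b))

def relative_score(patch, foreground, background):
    """Score a patch by how much closer it is to the foreground
    template than to the background template."""
    return dot(patch, foreground) - dot(patch, background)

def select_target(patches, foreground, background):
    """Return the index of the candidate patch with the highest
    relative score, mirroring how a relative tracker localizes
    the target among candidates."""
    scores = [relative_score(p, foreground, background) for p in patches]
    return max(range(len(scores)), key=scores.__getitem__)

# Toy example: three candidate patches as 3-D feature vectors.
fg = [1.0, 0.0, 0.0]   # foreground (target) template
bg = [0.0, 1.0, 0.0]   # background template
candidates = [[0.2, 0.9, 0.1], [0.8, 0.1, 0.3], [0.1, 0.1, 0.9]]
best = select_target(candidates, fg, bg)  # index of the chosen patch
```

In the paper the pairwise score is learned end-to-end by a CNN over patch pairs; here the fixed dot-product margin merely illustrates the selection step.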
Keywords: Visual Tracking; Deep Learning; Relative Model
WOS Headings: Science & Technology; Technology
DOI: 10.1109/TIP.2017.2656628
WOS Keywords: ROBUST VISUAL TRACKING; OBJECT TRACKING; BENCHMARK
Indexed By: SCI
Language: English
Funding: National Natural Science Foundation of China (61225009, 61432019, 61572498, 61532009, 61572296); Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions (IDHT20140224)
WOS Research Areas: Computer Science; Engineering
WOS Categories: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WOS Accession No.: WOS:000398976000005
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/13645
Collection: National Lab of Pattern Recognition, Multimedia Computing and Graphics
Corresponding Author: Changsheng Xu
Affiliations: 1. National Lab of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
Recommended Citation:
GB/T 7714: Gao, Junyu, Zhang, Tianzhu, Yang, Xiaoshan, et al. Deep Relative Tracking[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2017, 26(4): 1845-1858.
APA: Gao, Junyu, Zhang, Tianzhu, Yang, Xiaoshan, & Xu, Changsheng. (2017). Deep Relative Tracking. IEEE TRANSACTIONS ON IMAGE PROCESSING, 26(4), 1845-1858.
MLA: Gao, Junyu, et al. "Deep Relative Tracking." IEEE TRANSACTIONS ON IMAGE PROCESSING 26.4 (2017): 1845-1858.
Files in This Item:
File Name/Size | Document Type | Version | Access | License
Deep Relative Tracking.pdf (5252 KB) | Journal article | Author's accepted manuscript | Open access | CC BY-NC-SA
File Name: Deep Relative Tracking.pdf
Format: Adobe PDF
Unless otherwise stated, all content in this repository is protected by copyright, with all rights reserved.