CASIA OpenIR > Research Center for Brain-Inspired Intelligence
Enhancing Person Re-identification by Robust Structural Metric Learning
Gang Yuan; Zhaoxiang Zhang; Yunhong Wang
2013-07-26
Conference Name: International Conference on Image and Graphics
Proceedings: ICIG 2013
Conference Date: 26-28 July 2013
Conference Location: Qingdao, China
Abstract: Person re-identification has become an important yet challenging task for video surveillance systems, as it aims to match people across non-overlapping camera views. So far, most successful methods focus on either robust feature representations or sophisticated learners. Recently, metric learning has been applied to this task, aiming to find a feature subspace suitable for matching samples from different cameras. However, most metric learning approaches rely on pairwise or triplet-based distance comparisons, which easily overfit in large-scale, high-dimensional settings. Moreover, the performance of these methods can degrade significantly when the extracted features contain noisy information. In this paper, we propose a robust structural metric learning model for person re-identification with two main advantages: 1) it applies loss functions at the level of rankings rather than pairwise distances, and 2) it is robust to noise in the extracted features. The approach is verified on two publicly available datasets, and experimental results show that our method achieves state-of-the-art performance.
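The abstract contrasts pairwise/triplet distance comparisons with a loss applied at the level of rankings. The following minimal sketch illustrates that distinction; the Mahalanobis parameterization M = LᵀL and the listwise hinge loss are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: a structural (ranking-level) loss penalizes the
# position of the true match within the whole ranked gallery, rather than
# penalizing individual pairwise distance comparisons in isolation.

def metric_dist(L, x, y):
    """Squared Mahalanobis distance under M = L^T L, i.e. ||L(x - y)||^2."""
    diff = [a - b for a, b in zip(x, y)]
    proj = [sum(row[k] * diff[k] for k in range(len(diff))) for row in L]
    return sum(v * v for v in proj)

def structural_loss(L, probe, gallery, true_idx, margin=1.0):
    """Ranking-level hinge loss: every gallery entry that comes within
    `margin` of outranking the true match contributes to the loss, so the
    loss is zero only when the true match is ranked first by a margin."""
    d_true = metric_dist(L, probe, gallery[true_idx])
    loss = 0.0
    for j, g in enumerate(gallery):
        if j == true_idx:
            continue
        loss += max(0.0, margin + d_true - metric_dist(L, probe, g))
    return loss
```

With the identity metric, a probe at the origin, and a gallery where the true match is much closer than all distractors, the loss is zero; swapping the true-match label to a distant gallery entry makes the loss positive, which is the ranking violation the learner would minimize.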
Keywords: Robust Person Re-identification; Structural Metric Learning; Input Sparsity
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/13288
Collection: Research Center for Brain-Inspired Intelligence
Corresponding Author: Zhaoxiang Zhang
Recommended Citation (GB/T 7714):
Gang Yuan, Zhaoxiang Zhang, Yunhong Wang. Enhancing Person Re-identification by Robust Structural Metric Learning[C], 2013.
Files in This Item:
There are no files associated with this item.
Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.