Spatial-Temporal Saliency Feature Extraction for Robust Mean-Shift Tracker
Suiwu Zheng; Linshan Liu; Hong Qiao
2014
Conference Name: Neural Information Processing. 21st International Conference, ICONIP 2014
Source Publication: Neural Information Processing. 21st International Conference, ICONIP 2014. Proceedings: LNCS 8834
Conference Date: 3-6 Nov. 2014
Conference Place: Kuching, Malaysia
Abstract: Robust object tracking in crowded, cluttered dynamic scenes is a difficult task in robotic vision, owing to the complex and changeable environment and the similarity of features between background and foreground. In this paper, a saliency feature extraction method is fused into a mean-shift tracker to overcome these difficulties. First, a spatial-temporal saliency feature extraction method is proposed to suppress interference from the complex background. Second, we propose a saliency evaluation method that fuses a top-down visual mechanism to enhance tracking performance. Finally, the efficiency of the saliency-feature-based mean-shift tracker is validated through experimental results and analysis.
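The abstract's tracker localizes the target with mean-shift iterations over a saliency-weighted map. As a minimal illustration of the underlying mean-shift step only (the paper's spatial-temporal saliency extraction and top-down evaluation are not reproduced here), the sketch below runs a mode search over a synthetic 2-D saliency map; all names and parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mean_shift(weight_map, center, win=15, max_iter=20, eps=0.5):
    """One mean-shift mode search over a 2-D weight (saliency) map.

    Each iteration moves the window center to the weighted centroid
    of the pixels inside the current window, until the shift is small.
    """
    h, w = weight_map.shape
    cy, cx = float(center[0]), float(center[1])
    for _ in range(max_iter):
        # Clip the search window to the image bounds.
        y0, y1 = int(max(cy - win, 0)), int(min(cy + win + 1, h))
        x0, x1 = int(max(cx - win, 0)), int(min(cx + win + 1, w))
        patch = weight_map[y0:y1, x0:x1]
        total = patch.sum()
        if total <= 0:
            break
        # Weighted centroid of the window = new center estimate.
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = (ys * patch).sum() / total
        nx = (xs * patch).sum() / total
        shift = np.hypot(ny - cy, nx - cx)
        cy, cx = ny, nx
        if shift < eps:
            break
    return cy, cx

# Synthetic saliency map: a single Gaussian blob peaking at (60, 80).
yy, xx = np.mgrid[0:120, 0:160]
saliency = np.exp(-((yy - 60) ** 2 + (xx - 80) ** 2) / (2 * 10.0 ** 2))

cy, cx = mean_shift(saliency, center=(50, 70))
print(round(cy), round(cx))  # converges near the blob peak
```

In a real tracker the weight map would come from the saliency features of the current frame rather than a synthetic blob, and the window center found in one frame seeds the search in the next.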
Keyword: None
Document Type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/12864
Collection: State Key Laboratory of Management and Control for Complex Systems_Robot Theory and Application
Corresponding Author: Suiwu Zheng
Recommended Citation (GB/T 7714):
Suiwu Zheng, Linshan Liu, Hong Qiao. Spatial-Temporal Saliency Feature Extraction for Robust Mean-Shift Tracker[C], 2014.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.