Enhancing Human Pose Estimation with Temporal Clues
Jianliang Hao; Zhaoxiang Zhang; Yunhong Wang
2014-11-07
Conference Name: 9th Chinese Conference on Biometric Recognition
Source Publication: CCBR 2014
Conference Date: 7-9 November 2014
Conference Place: Shenyang, China
Abstract: We address the challenging problem of human pose estimation, which can serve as a preprocessing step that provides accurate and refined human pose information for gait recognition and other applications. In this paper, we propose a method, augmented Pose-NMS, to perform human pose estimation over consecutive frames based on a reasonable assumption: poses in adjacent frames change only slightly. First, we merge the multiple estimated pose candidates in a single frame to obtain representative pose candidates. Then we propagate the final candidates backward and forward, based on Bayesian theory, to increase the number of confident candidates. We apply our method to the Buffy video dataset and obtain results competitive with the state of the art.
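The abstract sketches a two-step pipeline: a single-frame merge of overlapping pose candidates (the augmented Pose-NMS step), followed by backward/forward propagation of the surviving candidates across neighboring frames under the small-motion assumption. Since no full text is attached to this record, the following Python sketch is only an illustration of that pipeline under stated assumptions: the pose representation, the distance threshold, the score decay, and all function names are invented here, and the simple score down-weighting stands in for the Bayesian re-weighting the abstract mentions.

```python
import numpy as np

# Illustrative sketch only, not the authors' implementation.
# A "pose" here is an (n_joints, 2) array of joint coordinates;
# each candidate also carries a detector confidence score.

def pose_distance(pose_a, pose_b):
    """Mean Euclidean distance between corresponding joints."""
    return float(np.mean(np.linalg.norm(pose_a - pose_b, axis=1)))

def merge_candidates(poses, scores, threshold=15.0):
    """Single-frame merge (a pose-NMS-style step, assumed): greedily
    keep the highest-scoring pose and suppress near-duplicate
    candidates whose mean joint distance falls below `threshold`."""
    order = sorted(range(len(poses)), key=lambda i: -scores[i])
    kept, suppressed = [], set()
    for i in order:
        if i in suppressed:
            continue
        kept.append(i)
        for j in order:
            if j not in suppressed and pose_distance(poses[i], poses[j]) < threshold:
                suppressed.add(j)
    return [poses[i] for i in kept], [scores[i] for i in kept]

def propagate(frames, decay=0.8):
    """Backward/forward propagation under the small-motion assumption:
    each merged pose in frame t is offered to frames t-1 and t+1 as an
    extra candidate with a down-weighted score (a crude stand-in for
    the Bayesian re-weighting described in the abstract), then each
    frame's enlarged candidate set is merged again."""
    extra = [([], []) for _ in frames]
    for t, (poses, scores) in enumerate(frames):
        for n in (t - 1, t + 1):
            if 0 <= n < len(frames):
                extra[n][0].extend(poses)
                extra[n][1].extend(s * decay for s in scores)
    return [merge_candidates(poses + e_p, scores + e_s)
            for (poses, scores), (e_p, e_s) in zip(frames, extra)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.uniform(0, 100, size=(14, 2))   # hypothetical 14-joint skeleton
    frames = []
    for t in range(3):                          # 3 consecutive frames
        poses = [base + t + rng.normal(0, 2, base.shape) for _ in range(4)]
        scores = list(rng.uniform(0.5, 1.0, 4))
        frames.append(merge_candidates(poses, scores))
    refined = propagate(frames)
    for t, (poses, scores) in enumerate(refined):
        print(f"frame {t}: {len(poses)} candidate(s), best score {max(scores):.2f}")
```

The propagation step only ever adds candidates and re-merges, so a frame where the detector failed can inherit a plausible pose from its neighbors; this mirrors the abstract's stated goal of increasing the number of confident candidates.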
Keywords: Human Pose Estimation; Augmented Pose-NMS; Temporal Clues
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/13252
Collection: Research Center for Brain-inspired Intelligence
Corresponding Author: Zhaoxiang Zhang
Recommended Citation (GB/T 7714):
Jianliang Hao, Zhaoxiang Zhang, Yunhong Wang. Enhancing Human Pose Estimation with Temporal Clues[C], 2014.
Files in This Item:
There are no files associated with this item.