CASIA OpenIR
A model-based gait recognition method with body pose and human prior knowledge
Liao, Rijun1,3; Yu, Shiqi2,3; An, Weizhi1,3; Huang, Yongzhen4,5
Source Publication: PATTERN RECOGNITION
ISSN: 0031-3203
Publication Date: 2020-02-01
Volume: 98  Pages: 11
Corresponding Author: Yu, Shiqi (yusq@sustech.edu.cn)
Abstract: In this paper we propose PoseGait, a novel model-based gait recognition method. Gait recognition is a challenging and attractive task in biometrics. Early approaches to gait recognition were mainly appearance-based: their features are usually extracted from human body silhouettes, which are easy to compute and have been shown to be effective for recognition tasks. Nevertheless, silhouette shape is not invariant to changes in clothing and can vary drastically due to illumination changes or other external factors. An alternative to silhouette-based features are model-based features; however, these are challenging to acquire, especially at low image resolution. In contrast to previous approaches, our model PoseGait exploits the 3D human pose, estimated from images by a convolutional neural network, as the input feature for gait recognition. The 3D pose, defined by the 3D coordinates of the joints of the human body, is invariant to view changes and other external factors of variation. We design spatio-temporal features from the 3D pose to improve the recognition rate. Our method is evaluated on two large datasets, CASIA B and CASIA E. The experimental results show that the proposed method achieves state-of-the-art performance and is robust to view and clothing variations. (C) 2019 Elsevier Ltd. All rights reserved.
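The abstract describes building spatio-temporal features from sequences of 3D joint coordinates. As a rough illustration of that idea (not the paper's actual feature design — the function name, joint layout, and feature choices below are illustrative assumptions), a minimal sketch might center each frame on a root joint, take joint-to-root distances as a static spatial cue, and frame-to-frame displacements as a temporal cue:

```python
import numpy as np

def pose_spatiotemporal_features(poses, root=0):
    """Toy spatio-temporal gait features from a 3D pose sequence.

    poses: array of shape (T, J, 3) -- T frames, J joints, 3D coordinates.
    Returns simple hand-crafted features:
      - "position": joint positions centered on the root joint (spatial)
      - "limb":     per-frame distance of each joint to the root (static shape)
      - "motion":   first-order temporal differences (dynamic)
    """
    poses = np.asarray(poses, dtype=float)

    # Spatial: center every frame on the root joint (e.g. the pelvis),
    # removing global translation of the body.
    centered = poses - poses[:, root:root + 1, :]

    # Static shape cue: Euclidean distance from each joint to the root.
    limb = np.linalg.norm(centered, axis=2)            # shape (T, J)

    # Temporal cue: frame-to-frame joint displacement.
    motion = np.diff(centered, axis=0)                 # shape (T-1, J, 3)

    return {"position": centered, "limb": limb, "motion": motion}
```

A feature tensor of this kind could then be stacked over time and fed to a CNN for recognition, which is the general pipeline the abstract outlines; the paper's concrete choices (e.g. joint angles, limb lengths, network architecture) should be taken from the publication itself.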
Keywords: Gait recognition; Human body pose; Spatio-temporal feature
DOI: 10.1016/j.patcog.2019.107069
WOS Keywords: IMAGE; TRANSFORMATION; BIOMETRICS
Indexed By: SCI
Language: English
Funding Project: National Natural Science Foundation of China [61976144]; Science Foundation of Shenzhen [20170504160426188]
Funding Organization: National Natural Science Foundation of China; Science Foundation of Shenzhen
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WOS ID: WOS:000497600300005
Publisher: ELSEVIER SCI LTD
Citation Statistics
Cited Times: 5 [WOS]
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/29376
Collection: Institute of Automation, Chinese Academy of Sciences
Affiliations:
1. Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen, Guangdong, Peoples R China
2.Southern Univ Sci & Technol, Dept Comp Sci & Engn, Shenzhen, Guangdong, Peoples R China
3.Shenzhen Inst Artificial Intelligence & Robot Soc, Shenzhen, Guangdong, Peoples R China
4.Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
5.Watrix Technol Ltd Co Ltd, Beijing, Peoples R China
Recommended Citation
GB/T 7714: Liao, Rijun, Yu, Shiqi, An, Weizhi, et al. A model-based gait recognition method with body pose and human prior knowledge[J]. PATTERN RECOGNITION, 2020, 98: 11.
APA: Liao, Rijun, Yu, Shiqi, An, Weizhi, & Huang, Yongzhen. (2020). A model-based gait recognition method with body pose and human prior knowledge. PATTERN RECOGNITION, 98, 11.
MLA: Liao, Rijun, et al. "A model-based gait recognition method with body pose and human prior knowledge." PATTERN RECOGNITION 98 (2020): 11.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.