CASIA OpenIR > Center for Research on Intelligent Perception and Computing
Structure Sparsity for Multi-camera Gait Recognition
Qiyue Yin1; Rong Sun1; Liang Wang2; Ran He(赫然)2
Conference Name: Chinese Conference on Pattern Recognition
Conference Date: 2012-09-24
Conference Place: Beijing, China
With the rapid development of surveillance technology, a single scene is often covered by several cameras, and using multiple cameras to perform gait recognition is a challenging problem. This paper studies multi-camera gait recognition via structure sparsity. For the multi-camera structure in the training set, we propose a structure sparsity algorithm to learn informative and discriminative sparse representations; for the structure in the testing set, we develop a new classification criterion based on the reconstruction error of the learned sparse representations. In addition, we learn a dictionary from the original gait data to further improve recognition accuracy while reducing computational cost. Experimental results show that the proposed method efficiently handles the multi-camera gait recognition problem and outperforms state-of-the-art sparse representation methods.
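The classification rule described in the abstract — assign a probe to the class whose training atoms best reconstruct it from its sparse code — follows the general sparse-representation-classification pattern. Below is a minimal sketch of that idea, not the paper's actual structure-sparsity algorithm or learned dictionary: it assumes a column-normalized dictionary `D` whose columns are labeled training samples, and the names `omp`, `src_classify`, and the sparsity level `k` are illustrative.

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: a sparse code of y over
    dictionary D (columns assumed unit-norm), with at most k atoms."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        idx = int(np.argmax(np.abs(D.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Re-fit coefficients on the selected atoms by least squares.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

def src_classify(D, labels, y, k=5):
    """SRC-style rule: keep only each class's coefficients in turn and
    return the class with the smallest reconstruction error."""
    x = omp(D, y, k)
    best_class, best_err = None, np.inf
    for c in np.unique(labels):
        xc = np.where(labels == c, x, 0.0)   # zero out other classes
        err = np.linalg.norm(y - D @ xc)
        if err < best_err:
            best_class, best_err = c, err
    return best_class
```

A dictionary learned from the original gait data (as the abstract proposes) would replace the raw training columns here, shrinking `D` and thus the per-probe cost of the sparse coding step.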
Document Type: Conference paper
Recommended Citation
GB/T 7714
Qiyue Yin, Rong Sun, Liang Wang, et al. Structure Sparsity for Multi-camera Gait Recognition[C], 2012: 259-267.
Files in This Item:
File Name/Size: Structured sparsity (256KB)
DocType: Conference paper
Access: Open Access
License: CC BY-NC-SA
File name: Structured sparsity for multi-camera gait recognition.pdf
Format: Adobe PDF

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.