Human Activity Recognition Based on R Transform
Wang Ying; Kaiqi Huang; Tieniu Tan
2007
Conference Name: The Seventh International Workshop on Visual Surveillance (CVPR 2007 workshop)
Source Publication: The Seventh International Workshop on Visual Surveillance (CVPR 2007 workshop)
Pages: 1-8
Conference Date: 2007-06-01
Conference Place: Minneapolis, Minnesota, USA
Abstract: This paper addresses human activity recognition with a new feature descriptor. For each binary human silhouette, an extended Radon transform, the R transform, is employed to extract low-level features. The advantages of the R transform lie in its low computational complexity and geometric invariance. A set of HMMs is then trained on the extracted features to recognize activities. Compared with other commonly used feature descriptors, the R transform is robust to frame loss in video, disjoint silhouettes, and holes in the shape, and thus achieves better performance in recognizing similar activities. Extensive experiments demonstrate the effectiveness of the proposed method.
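
Illustrative sketch (not part of this record): the abstract describes a two-stage pipeline, a per-frame R transform descriptor followed by HMM-based recognition. The Python code below sketches the first stage under the usual definition of the R transform, R(theta) = integral over rho of g(rho, theta)^2, where g is the Radon transform of the silhouette. The function names (r_transform, sequence_features), the 180-angle sampling, and the max-normalization step are illustrative assumptions rather than details taken from the paper.

import numpy as np
from skimage.transform import radon

def r_transform(silhouette, n_angles=180):
    # Radon transform g(rho, theta) of a binary silhouette, sampled at n_angles angles.
    thetas = np.linspace(0.0, 180.0, n_angles, endpoint=False)
    sinogram = radon(silhouette.astype(float), theta=thetas, circle=False)
    # R(theta): integrate the squared projections over rho (sum along the rho axis).
    r = np.sum(sinogram ** 2, axis=0)
    # Max-normalization (an assumption here) makes the descriptor roughly scale invariant.
    return r / (r.max() + 1e-12)

def sequence_features(silhouettes, n_angles=180):
    # Stack per-frame descriptors into a (T, n_angles) observation sequence for an HMM.
    return np.stack([r_transform(s, n_angles) for s in silhouettes], axis=0)

if __name__ == "__main__":
    # Toy usage: a rectangular "silhouette" translated across three frames. Because the
    # R transform integrates over rho, the descriptor barely changes under translation.
    frames = []
    for shift in (0, 5, 10):
        img = np.zeros((64, 64))
        img[20:45, 15 + shift:30 + shift] = 1.0
        frames.append(img)
    feats = sequence_features(frames)
    print(feats.shape)                        # (3, 180)
    print(np.abs(feats[0] - feats[1]).max())  # near zero: translation invariance

Per the abstract, such observation sequences would then be used to train one HMM per activity (for example with a generic HMM library), with a test sequence assigned to the highest-likelihood model; that classification rule is a common choice assumed here, not a detail confirmed by this record.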
Keywords: Radon Transform; Feature Extraction; Hidden Markov Models
Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/12723
Collection: Center for Research on Intelligent Perception and Computing (智能感知与计算研究中心)
Corresponding Author: Kaiqi Huang
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Wang Ying, Kaiqi Huang, Tieniu Tan. Human Activity Recognition Based on R Transform[C], 2007: 1-8.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.