CASIA OpenIR > Center for Research on Intelligent Perception and Computing
Representation Learning of Temporal Dynamics for Skeleton-Based Action Recognition
Du, Yong1; Fu, Yun2,3; Wang, Liang1,4
Abstract: Motion characteristics of human actions can be represented by the position variation of skeleton joints. Traditional approaches generally extract spatial-temporal representations of skeleton sequences with well-designed hand-crafted features. In this paper, to recognize actions according to the relative motion between the limbs and the trunk, we propose an end-to-end hierarchical RNN for skeleton-based action recognition. We divide the human skeleton into five main parts according to the human physical structure, and feed them into five independent subnets for local feature extraction. After hierarchical feature fusion and extraction from local to global, the dimension of the final temporal dynamics representation is reduced, through a single-layer perceptron, to the number of action categories in the corresponding data set. The output of the perceptron is then temporally accumulated as the input of a softmax layer for classification. Random scale and rotation transformations are employed during training to improve robustness. To verify the effectiveness of the proposed network, we compare it with five other deep RNN variants derived from our model, as well as with several other methods on motion capture and Kinect data sets. Furthermore, we evaluate the robustness of our model trained with random scale and rotation transformations on a multi-view problem. Experimental results demonstrate that our model achieves state-of-the-art performance with high computational efficiency.
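The pipeline described in the abstract (five part-wise subnets, hierarchical fusion from local to global, a single-layer perceptron, temporal accumulation, then softmax) can be sketched as a forward pass. This is a minimal illustration only, with assumed sizes (4 joints per part, 16 hidden units, 20 frames), untrained random weights, and plain unidirectional tanh RNN cells in place of the bidirectional recurrent layers used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rnn(d_in, d_hid):
    """Random parameters for a simple tanh RNN cell (illustrative only)."""
    s = 1.0 / np.sqrt(d_hid)
    return (rng.uniform(-s, s, (d_hid, d_in)),   # input weights
            rng.uniform(-s, s, (d_hid, d_hid)),  # recurrent weights
            np.zeros(d_hid))                     # bias

def rnn_forward(x_seq, params):
    """Run a tanh RNN over a (T, d_in) sequence; return (T, d_hid) states."""
    Wx, Wh, b = params
    h, out = np.zeros(Wh.shape[0]), []
    for x in x_seq:
        h = np.tanh(Wx @ x + Wh @ h + b)
        out.append(h)
    return np.stack(out)

# Assumed toy input: 5 body parts (trunk, two arms, two legs),
# each with 4 joints of 3-D coordinates, over T frames.
T, n_classes, d_hid = 20, 10, 16
parts = [rng.standard_normal((T, 4 * 3)) for _ in range(5)]

# Layer 1: one independent subnet per body part (local features).
local = [rnn_forward(p, make_rnn(12, d_hid)) for p in parts]

# Layer 2: fuse neighbouring parts and extract higher-level features.
upper = rnn_forward(np.concatenate([local[0], local[1], local[2]], axis=1),
                    make_rnn(3 * d_hid, d_hid))  # trunk + both arms
lower = rnn_forward(np.concatenate([local[0], local[3], local[4]], axis=1),
                    make_rnn(3 * d_hid, d_hid))  # trunk + both legs

# Layer 3: global representation over the whole body.
whole = rnn_forward(np.concatenate([upper, lower], axis=1),
                    make_rnn(2 * d_hid, d_hid))

# Single-layer perceptron maps each frame to per-class scores;
# scores are accumulated over time before the softmax.
W_out = rng.standard_normal((n_classes, d_hid)) * 0.1
accumulated = (whole @ W_out.T).sum(axis=0)      # temporal accumulation
probs = np.exp(accumulated - accumulated.max())
probs /= probs.sum()                             # class probabilities, shape (n_classes,)
```

The hierarchy mirrors the part-to-whole structure of the skeleton: limb subnets capture local dynamics, while the fused layers model coordination between limbs and trunk.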
Keywords: Action Recognition ; Hierarchical Recurrent Neural Network ; Random Scale & Rotation Transformations ; Skeleton
WOS Headings: Science & Technology ; Technology
Indexed By: SCI
Funding Organization: National Basic Research Program of China (2012CB316300) ; National Science Foundation (1314484) ; Strategic Priority Research Program within the Chinese Academy of Sciences (XDB02070100) ; National Natural Science Foundation of China (61525306 ; 61420106015)
WOS Research Area: Computer Science ; Engineering
WOS Subject: Computer Science, Artificial Intelligence ; Engineering, Electrical & Electronic
WOS ID: WOS:000376087700006
Citation statistics
Cited Times: 24 (WOS)
Document Type: Journal article
Corresponding Author: Wang, Liang
Affiliation: 1. Chinese Acad Sci, Inst Automat, Ctr Res Intelligent Percept & Comp, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
2. Northeastern Univ, Dept Elect & Comp Engn, Coll Engn, Boston, MA 02115 USA
3. Northeastern Univ, Coll Comp & Informat Sci, Boston, MA 02115 USA
4. CAS Ctr Excellence Brain Sci & Intelligence Techn, Beijing 100190, Peoples R China
Recommended Citation
GB/T 7714
Du, Yong, Fu, Yun, Wang, Liang. Representation Learning of Temporal Dynamics for Skeleton-Based Action Recognition[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2016, 25(7): 3010-3022.
APA: Du, Yong, Fu, Yun, & Wang, Liang. (2016). Representation Learning of Temporal Dynamics for Skeleton-Based Action Recognition. IEEE TRANSACTIONS ON IMAGE PROCESSING, 25(7), 3010-3022.
MLA: Du, Yong, et al. "Representation Learning of Temporal Dynamics for Skeleton-Based Action Recognition." IEEE TRANSACTIONS ON IMAGE PROCESSING 25.7 (2016): 3010-3022.
Files in This Item:
File Name/Size: TIP_paper_published.pdf (4688 KB) | DocType: Journal article | Version: Author-accepted manuscript | Access: Open access | License: CC BY-NC-SA

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.