Emotional head motion predicting from prosodic and linguistic features
Authors: Yang, Minghao (1); Jiang, Jinlin (2); Tao, Jianhua (1); Mu, Kaihui (1); Li, Hao (1)
Source Publication: MULTIMEDIA TOOLS AND APPLICATIONS
Date Issued: 2016-05-01
Volume: 75
Issue: 9
Pages: 5125-5146
Subtype: Article
Abstract: Emotional head motion plays an important role in human-computer interaction (HCI) and is one of the key factors in improving the user experience. However, it remains unclear how head motions are influenced by speech features in different emotional states. In this study, we construct a bimodal mapping model from speech to head motion and investigate which prosodic and linguistic features have the most significant influence on emotional head motion. A two-layer clustering scheme is introduced to obtain reliable clusters from head motion parameters. With these clusters, an emotion-related speech-to-head-gesture mapping model is built with a Classification and Regression Tree (CART). Based on the statistical results of the CART, a systematic map of the relationship between speech features (both prosodic and linguistic) and head gestures is presented. The map reveals which features have the most significant influence on head motion in long and short utterances. We also analyze how linguistic features contribute to different emotional expressions. These findings provide useful references for the realistic animation of speech-driven talking heads and avatars.
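The abstract describes a two-stage pipeline: head motion parameters are grouped by a two-layer clustering scheme, and a CART then maps prosodic and linguistic features to the resulting head-gesture clusters. The following Python sketch illustrates that general idea with scikit-learn on synthetic placeholder data; the feature sets, cluster counts, and tree depth are illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Placeholder data standing in for the paper's corpus: per-utterance head
# motion parameters (e.g., Euler angles and their deltas) and aligned
# speech features (e.g., F0 mean, energy, duration). Dimensions are
# illustrative assumptions.
head_motion = rng.normal(size=(500, 6))
speech_feats = rng.normal(size=(500, 3))

# Layer 1: coarse clustering of the head motion parameters.
coarse = KMeans(n_clusters=4, n_init=10, random_state=0).fit(head_motion)

# Layer 2: split each coarse cluster into sub-clusters; the resulting
# labels serve as the head-gesture classes that the CART will predict.
gesture_labels = np.empty(len(head_motion), dtype=int)
next_label = 0
for c in range(coarse.n_clusters):
    idx = np.flatnonzero(coarse.labels_ == c)
    sub = KMeans(n_clusters=2, n_init=10, random_state=0).fit(head_motion[idx])
    gesture_labels[idx] = sub.labels_ + next_label
    next_label += 2

# CART mapping from speech features to head-gesture clusters.
# scikit-learn's DecisionTreeClassifier is an optimized CART variant.
cart = DecisionTreeClassifier(max_depth=5, random_state=0)
cart.fit(speech_feats, gesture_labels)
print(cart.predict(speech_feats[:5]))
```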
Keywords: Visual Prosody; Head Gesture; Prosody Clustering
WOS Headings: Science & Technology; Technology
DOI: 10.1007/s11042-016-3405-3
WOS Keywords: AFFECT RECOGNITION; EXPRESSIONS; ANIMATION; DRIVEN; FACE
Indexed By: SCI
Language: English
Funding Organization: National High-Tech Research and Development Program of China (863 Program) (2015AA016305); National Natural Science Foundation of China (NSFC) (61332017, 61375027, 61203258, 61273288, 61233009, 61425017)
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Information Systems; Computer Science, Software Engineering; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000376601700018
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/12229
Collection: National Laboratory of Pattern Recognition_Speech Interaction
Affiliation:
1. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
2. Univ Int Business & Econ, Sch Int Studies, Beijing, Peoples R China
First Author Affiliation: Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, et al. Emotional head motion predicting from prosodic and linguistic features[J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2016, 75(9): 5125-5146.
APA: Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, Mu, Kaihui, & Li, Hao. (2016). Emotional head motion predicting from prosodic and linguistic features. MULTIMEDIA TOOLS AND APPLICATIONS, 75(9), 5125-5146.
MLA: Yang, Minghao, et al. "Emotional head motion predicting from prosodic and linguistic features". MULTIMEDIA TOOLS AND APPLICATIONS 75.9 (2016): 5125-5146.
Files in This Item:
File Name/Size: MTAP16 Emotional head motion predicting from prosodic and linguistic features-Minghao Yang.pdf (1730KB)
DocType: Journal Article
Version: Author Accepted Manuscript
Access: Open Access
License: CC BY-NC-SA