Emotional head motion predicting from prosodic and linguistic features
Yang, Minghao1; Jiang, Jinlin2; Tao, Jianhua1; Mu, Kaihui1; Li, Hao1
Date Issued: 2016-05-01
Journal: MULTIMEDIA TOOLS AND APPLICATIONS
Volume: 75, Issue: 9, Pages: 5125-5146
Article Type: Article
Abstract: Emotional head motion plays an important role in human-computer interaction (HCI) and is an important factor in improving the user experience. However, it is still unclear how head motions are influenced by speech features in different emotional states. In this study, we construct a bimodal mapping model from speech to head motion and investigate which prosodic and linguistic features have the most significant influence on emotional head motion. A two-layer clustering scheme is introduced to obtain reliable clusters from head motion parameters. With these clusters, an emotion-related speech-to-head-gesture mapping model is constructed with a Classification and Regression Tree (CART). Based on the statistical results of the CART, a systematic statistical map of the relationship between speech features (including prosodic and linguistic features) and head gestures is presented. The map reveals the features that have the most significant influence on head motions in long and short utterances. We also analyze how linguistic features contribute to different emotional expressions. The discussion in this work provides important references for realistic animation of speech-driven talking heads and avatars.
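The pipeline the abstract describes — a two-layer clustering of head-motion parameters followed by a CART that maps speech features to gesture clusters — can be sketched roughly as below. This is an illustrative reconstruction on synthetic data, not the authors' implementation; the feature dimensions, cluster counts, and tree depth are all assumptions.

```python
# Illustrative sketch (not the paper's code): cluster head-motion
# parameters in two layers, then fit a CART mapping prosodic features
# to the resulting gesture classes.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins: per-frame head-motion parameters (e.g. Euler
# angles) and prosodic features (e.g. F0, energy, duration) -- the
# actual feature sets in the paper differ.
head_motion = rng.normal(size=(500, 3))
prosody = rng.normal(size=(500, 4))

# Layer 1: coarse clustering of head-motion parameters.
coarse = KMeans(n_clusters=4, n_init=10, random_state=0).fit(head_motion)

# Layer 2: refine each coarse cluster into sub-clusters, yielding a
# global gesture-class label per sample.
labels = np.zeros(len(head_motion), dtype=int)
for c in range(4):
    idx = np.where(coarse.labels_ == c)[0]
    sub = KMeans(n_clusters=2, n_init=10, random_state=0).fit(head_motion[idx])
    labels[idx] = c * 2 + sub.labels_

# CART: map prosodic features to gesture classes; feature importances
# would indicate which speech features matter most.
cart = DecisionTreeClassifier(max_depth=5, random_state=0).fit(prosody, labels)
predicted = cart.predict(prosody)
print(predicted.shape)  # (500,) -- one gesture class per frame
```

In practice the CART's feature-importance statistics are what support the kind of feature-to-gesture map the abstract mentions.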
Keywords: Visual Prosody ; Head Gesture ; Prosody Clustering
WOS Headings: Science & Technology ; Technology
DOI: 10.1007/s11042-016-3405-3
WOS Keywords: AFFECT RECOGNITION ; EXPRESSIONS ; ANIMATION ; DRIVEN ; FACE
Indexed By: SCI
Language: English
Funding: National High-Tech Research and Development Program of China (863 Program) (2015AA016305) ; National Natural Science Foundation of China (NSFC) (61332017, 61375027, 61203258, 61273288, 61233009, 61425017)
WOS Research Area: Computer Science ; Engineering
WOS Subject: Computer Science, Information Systems ; Computer Science, Software Engineering ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic
WOS ID: WOS:000376601700018
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/12229
Collection: National Laboratory of Pattern Recognition / Speech Interaction
Affiliations:
1. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
2. Univ Int Business & Econ, Sch Int Studies, Beijing, Peoples R China
Recommended Citation:
GB/T 7714:
Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, et al. Emotional head motion predicting from prosodic and linguistic features[J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2016, 75(9): 5125-5146.
APA: Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, Mu, Kaihui, & Li, Hao. (2016). Emotional head motion predicting from prosodic and linguistic features. MULTIMEDIA TOOLS AND APPLICATIONS, 75(9), 5125-5146.
MLA: Yang, Minghao, et al. "Emotional head motion predicting from prosodic and linguistic features". MULTIMEDIA TOOLS AND APPLICATIONS 75.9 (2016): 5125-5146.
Files in This Item:
File Name/Size | Document Type | Version | Access | License
MTAP16 Emotional head motion predicting from prosodic and linguistic features-Minghao Yang.pdf (1730KB) | Journal article | Author's accepted manuscript | Open access | CC BY-NC-SA
File: MTAP16 Emotional head motion predicting from prosodic and linguistic features-Minghao Yang.pdf
Format: Adobe PDF
 

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.