Emotional head motion predicting from prosodic and linguistic features
Yang, Minghao1,2; Jiang, Jinlin1,2; Tao, Jianhua1,2; Mu, Kaihui1,2; Li, Hao1,2
Journal: MULTIMEDIA TOOLS AND APPLICATIONS
Publication Date: 2016-05-01
Volume: 75  Issue: 9  Pages: 5125-5146
Article Type: Article
Abstract

Emotional head motion plays an important role in human-computer interaction (HCI) and is one of the key factors in improving the user experience. However, it is still unclear how head motions are influenced by speech features in different emotional states. In this study, we construct a bimodal mapping model from speech to head motion and investigate which prosodic and linguistic features have the most significant influence on emotional head motion. A two-layer clustering scheme is introduced to obtain reliable clusters from the head motion parameters. With these clusters, an emotion-related speech-to-head-gesture mapping model is built with a Classification and Regression Tree (CART). Based on the statistical results of the CART, a systematic statistical map of the relationship between speech features (both prosodic and linguistic) and head gestures is presented. The map reveals which features have the most significant influence on head motion in long and short utterances. We also analyze how linguistic features contribute to different emotional expressions. The discussion in this work provides important references for the realistic animation of speech-driven talking heads and avatars.
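To make the pipeline concrete, below is a minimal sketch, assuming scikit-learn and synthetic stand-in data: head motion parameters are clustered in two layers, and a CART then maps speech features to the resulting gesture clusters. The feature dimensions, cluster counts, and tree depth are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of the abstract's pipeline (assumptions: scikit-learn,
# synthetic data; cluster counts and tree depth are illustrative only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Stand-ins for the real corpus: per-frame head motion parameters
# (hypothetically pitch/yaw/roll angles) and speech features
# (hypothetically F0, energy, duration, a linguistic category code).
head_motion = rng.normal(size=(500, 3))
speech_feats = rng.normal(size=(500, 4))

# Layer 1: coarse clusters over all head motion parameters.
K1, K2 = 4, 3
coarse = KMeans(n_clusters=K1, n_init=10, random_state=0).fit(head_motion)

# Layer 2: refine each coarse cluster into K2 sub-clusters, yielding a
# global gesture label per frame (coarse index * K2 + fine index).
labels = np.empty(len(head_motion), dtype=int)
for c in range(K1):
    mask = coarse.labels_ == c
    fine = KMeans(n_clusters=K2, n_init=10, random_state=0).fit(head_motion[mask])
    labels[mask] = c * K2 + fine.labels_

# CART: map speech features to the gesture cluster labels; the fitted
# tree's splits indicate which speech features drive the mapping.
cart = DecisionTreeClassifier(max_depth=6, random_state=0).fit(speech_feats, labels)
print(cart.predict(speech_feats[:5]))    # predicted gesture clusters
print(cart.feature_importances_)         # rough per-feature influence
```

On real data, the fitted tree's `feature_importances_` would give a rough analogue of the feature-influence statistics the paper reads off its CART.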

Keywords: Visual Prosody ; Head Gesture ; Prosody ; Clustering
WOS Headings: Science & Technology ; Technology
DOI: 10.1007/s11042-016-3405-3
WOS Keywords: AFFECT RECOGNITION ; EXPRESSIONS ; ANIMATION ; DRIVEN ; FACE
Indexed By: SCI
Language: English
WOS Research Area: Computer Science ; Engineering
WOS Subject: Computer Science, Information Systems ; Computer Science, Software Engineering ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic
WOS ID: WOS:000376601700018
Sub-direction Classification (Seven Major Directions): AI + Science
State Key Laboratory Research Direction: Multimodal Collaborative Cognition
Citation Statistics
Times Cited (WOS): 5
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/40822
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems - Intelligent Interaction
Corresponding Author: Yang, Minghao
Affiliations:
1. Institute of Automation, Chinese Academy of Sciences (CASIA)
2. University of Chinese Academy of Sciences
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Corresponding Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation:
GB/T 7714: Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, et al. Emotional head motion predicting from prosodic and linguistic features[J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2016, 75(9): 5125-5146.
APA: Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, Mu, Kaihui, & Li, Hao. (2016). Emotional head motion predicting from prosodic and linguistic features. MULTIMEDIA TOOLS AND APPLICATIONS, 75(9), 5125-5146.
MLA: Yang, Minghao, et al. "Emotional head motion predicting from prosodic and linguistic features". MULTIMEDIA TOOLS AND APPLICATIONS 75.9 (2016): 5125-5146.
Files in This Item:
File Name/Size: 2016.7(59)_Emotional head motion predicting from prosodic and linguistic features_MTAP_EI(SCI)-sml.pdf (804KB)
Format: Adobe PDF
Document Type: Journal Article
Version: Author's Accepted Manuscript
Access: Open Access
License: CC BY-NC-SA