Bayesian Inference based Temporal Modeling for Naturalistic Affective Expression Classification
Linlin Chao; Jianhua Tao; Minghao Yang
2013
Conference Name: Humaine Association Conference on Affective Computing and Intelligent Interaction, ACII 2013
Proceedings Title: 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction
Pages: 173-178
Conference Date: 2013-09
Conference Venue: Geneva, Switzerland
Abstract: In real life, the affective state of human beings changes gradually and smoothly, so there is a high probability that the affective state at a given moment depends on the states of the preceding period. In this study, we propose to explicitly model this temporal relationship using a Bayesian inference based two-stage classification approach. The approach incorporates knowledge about the dynamics of affective states over a period, so that the inferred affective states are predicted with a certain amount of context taken into account. Evaluations on the Audio Sub-Challenge of the 2011 Audio/Visual Emotion Challenge show that our approach achieves results competitive with those of the Audio Sub-Challenge winners. The temporal context modeling method proposed in this paper is also applicable to other sequential pattern recognition problems.
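The abstract describes the approach only at a high level. The sketch below illustrates one common way such Bayesian temporal smoothing can be realized: per-frame posteriors from a stage-one classifier are recursively combined with a first-order Markov transition prior by forward filtering. The function name, the transition matrix, and the scaled-likelihood conversion are illustrative assumptions, not the paper's actual two-stage formulation.

import numpy as np

def bayesian_temporal_smoothing(frame_posteriors, transition, prior):
    """Forward filtering: combine per-frame classifier posteriors with a
    first-order Markov transition prior (hypothetical sketch).

    frame_posteriors : (T, K) per-frame class posteriors from a static
                       stage-one classifier.
    transition       : (K, K) matrix, transition[i, j] = P(state_j | state_i).
    prior            : (K,) class prior used to turn posteriors into
                       scaled likelihoods.
    Returns a (T, K) array of temporally smoothed posteriors.
    """
    T, K = frame_posteriors.shape
    smoothed = np.zeros((T, K))
    # Scaled likelihood: P(o_t | s_t) is proportional to P(s_t | o_t) / P(s_t)
    likelihood = frame_posteriors / prior
    belief = prior.copy()
    for t in range(T):
        predicted = belief @ transition      # predict P(s_t | o_{1:t-1})
        belief = likelihood[t] * predicted   # update with current observation
        belief /= belief.sum()               # renormalize
        smoothed[t] = belief
    return smoothed

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K = 2                                       # e.g. binary arousal labels
    posts = rng.dirichlet(np.ones(K), size=50)  # stand-in stage-one outputs
    trans = np.array([[0.9, 0.1],               # states assumed to change slowly
                      [0.1, 0.9]])
    prior = np.full(K, 1.0 / K)
    print(bayesian_temporal_smoothing(posts, trans, prior)[:3])

Because the transition prior favors self-transitions, isolated frame-level misclassifications are smoothed out while gradual state changes are preserved, which is the intuition the abstract appeals to.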
Keywords: Affect Dimensions
Indexed By: EI
Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/11844
Collection: National Laboratory of Pattern Recognition_Speech Interaction
Corresponding Author: Linlin Chao
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Linlin Chao, Jianhua Tao, Minghao Yang. Bayesian Inference based Temporal Modeling for Naturalistic Affective Expression Classification[C], 2013: 173-178.
Files in This Item:
File Name/Size: Multimodal Emotion Recognition_IEEE_ACII_2013_EI.pdf (236KB)
Format: Adobe PDF
Document Type: Conference Paper
Access Type: Open Access
License: CC BY-NC-SA