Knowledge Aware Emotion Recognition in Textual Conversations via Multi-Task Incremental Transformer
Zhang, Duzhen [1,2]; Chen, Xiuyi [1,2]; Xu, Shuang [1]; Xu, Bo [1,2]
2020-12
Conference Name: Proceedings of the 28th International Conference on Computational Linguistics
Conference Date: 2020-12
Conference Place: Barcelona, Spain (Online)
Abstract
Emotion recognition in textual conversations (ERTC) plays an important role in a wide range of applications, such as opinion mining and recommender systems. ERTC, however, is a challenging task: speakers often rely on context and commonsense knowledge to express emotions, and most utterances in a conversation are emotionally neutral, so the confusion between the few non-neutral utterances and the far more numerous neutral ones limits recognition performance. In this paper, we propose a novel Knowledge Aware Incremental Transformer with Multi-task Learning (KAITML) to address these challenges. First, we devise a dual-level graph attention mechanism that leverages commonsense knowledge to augment the semantic information of each utterance. We then apply an Incremental Transformer to encode multi-turn contextual utterances. Moreover, we are the first to introduce multi-task learning to alleviate the aforementioned confusion and thus further improve emotion recognition performance. Extensive experimental results show that our KAITML model outperforms state-of-the-art models across five benchmark datasets.
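The abstract describes the model only at a high level. As a purely illustrative aid, the sketch below shows one common way to set up the kind of multi-task objective it mentions: a shared context-aware utterance representation feeding a main emotion classifier plus an auxiliary neutral vs. non-neutral classifier. This is a minimal sketch assuming PyTorch; the module name MultiTaskEmotionHead, the head layout, and the weighting factor aux_weight are illustrative assumptions, not the authors' KAITML implementation.

```python
import torch
import torch.nn as nn

class MultiTaskEmotionHead(nn.Module):
    """Illustrative multi-task head: a main emotion classifier plus an
    auxiliary neutral vs. non-neutral classifier on top of a shared
    context-aware utterance representation. Names and hyper-parameters
    are assumptions, not taken from the KAITML paper."""

    def __init__(self, hidden_size: int, num_emotions: int, aux_weight: float = 0.5):
        super().__init__()
        self.emotion_head = nn.Linear(hidden_size, num_emotions)  # main task
        self.neutral_head = nn.Linear(hidden_size, 2)             # auxiliary task
        self.aux_weight = aux_weight
        self.ce = nn.CrossEntropyLoss()

    def forward(self, utterance_repr, emotion_labels, neutral_labels):
        # utterance_repr: (batch, hidden_size) encodings from any context encoder
        emotion_logits = self.emotion_head(utterance_repr)
        neutral_logits = self.neutral_head(utterance_repr)
        loss = (self.ce(emotion_logits, emotion_labels)
                + self.aux_weight * self.ce(neutral_logits, neutral_labels))
        return emotion_logits, loss


# Toy usage with random data, just to show the shapes involved.
head = MultiTaskEmotionHead(hidden_size=768, num_emotions=7)
reprs = torch.randn(4, 768)            # 4 utterances in a batch
emotions = torch.randint(0, 7, (4,))   # gold emotion labels
neutral = (emotions == 0).long()       # assume label 0 means "neutral"
logits, loss = head(reprs, emotions, neutral)
loss.backward()
```

In such a setup, the auxiliary binary task gives the shared representation an explicit signal for separating neutral from non-neutral utterances, which is exactly the confusion the abstract says multi-task learning is meant to alleviate; the relative weight of the two losses is a tunable design choice.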

Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/48920
Collection: Digital Content Technology and Service Research Center, Auditory Models and Cognitive Computing Group
Affiliation: 1. Institute of Automation, Chinese Academy of Sciences (CASIA), Beijing, China
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Zhang, Duzhen, Chen, Xiuyi, Xu, Shuang, et al. Knowledge Aware Emotion Recognition in Textual Conversations via Multi-Task Incremental Transformer[C], 2020.
Files in This Item:
File Name/Size: 1322_Paper.pdf (1596 KB)
DocType: Conference Paper
Access: Open Access
License: CC BY-NC-SA
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.