Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation
Zhang, Yingying (1,2); Fang, Quan (1,3); Qian, Shengsheng (1,3); Xu, Changsheng (1,2,4)
Journal: ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY
ISSN: 2157-6904
Publication Date: 2020-07-01
Volume: 11; Issue: 4; Pages: 20
Abstract

Natural language generation has become a fundamental task in dialogue systems. RNN-based response generation methods encode the dialogue context and decode it into a response, but they tend to generate dull and simple responses. In this article, we propose a novel framework, KAWA-DRG (Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation), which models conversation-specific external knowledge and the varying importance of different parts of the dialogue context in a unified adversarial encoder-decoder learning framework. In KAWA-DRG, a co-attention mechanism attends to the important parts within and among context utterances through word-utterance-level attention. Prior knowledge is integrated into a conditional Wasserstein auto-encoder to learn the latent variable space. The posterior and prior distributions of the latent variables are generated and trained through adversarial learning. We evaluate our model on the Switchboard, DailyDialog, In-Car Assistant, and Ubuntu Dialogue corpora. Experimental results show that KAWA-DRG outperforms existing methods.
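The adversarial latent-variable training the abstract describes resembles the general Wasserstein auto-encoder (WAE-GAN) recipe: a posterior network encodes the context together with the gold response into a latent sample, a prior network produces a latent sample from the knowledge-augmented context alone, and a critic is trained to tell the two apart while the encoders learn to fool it. The following PyTorch sketch is a minimal illustration of that recipe under those assumptions; the module names (PosteriorNet, PriorNet, LatentCritic), dimensions, and exact loss form are hypothetical and not the authors' implementation.

    import torch
    import torch.nn as nn

    class PosteriorNet(nn.Module):
        """Hypothetical recognition network: maps the encoded context plus
        the gold response to a posterior latent sample."""
        def __init__(self, in_dim: int, z_dim: int):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, z_dim), nn.Tanh())

        def forward(self, ctx_and_resp: torch.Tensor) -> torch.Tensor:
            return self.net(ctx_and_resp)

    class PriorNet(nn.Module):
        """Hypothetical prior network: maps the knowledge-augmented context
        alone to a prior latent sample."""
        def __init__(self, in_dim: int, z_dim: int):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, z_dim), nn.Tanh())

        def forward(self, ctx: torch.Tensor) -> torch.Tensor:
            return self.net(ctx)

    class LatentCritic(nn.Module):
        """WGAN-style critic that scores latent samples, trained to
        separate posterior samples from prior samples."""
        def __init__(self, z_dim: int, hidden: int = 256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(z_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

        def forward(self, z: torch.Tensor) -> torch.Tensor:
            return self.net(z)

    def adversarial_latent_losses(critic, z_post, z_prior):
        """Wasserstein-style losses for matching the posterior latent
        distribution to the prior one. The critic pushes prior scores up
        and posterior scores down (a Lipschitz constraint such as weight
        clipping or a gradient penalty is omitted for brevity); the
        generator loss makes posterior samples score like prior ones."""
        critic_loss = (critic(z_post.detach()).mean()
                       - critic(z_prior.detach()).mean())
        gen_loss = -critic(z_post).mean()  # gradients flow back into the encoder
        return critic_loss, gen_loss

    # Toy usage: one pair of losses on random encodings (batch of 8).
    ctx = torch.randn(8, 128)        # knowledge-augmented context encoding
    ctx_resp = torch.randn(8, 256)   # context + gold response encoding
    post, prior, critic = PosteriorNet(256, 64), PriorNet(128, 64), LatentCritic(64)
    c_loss, g_loss = adversarial_latent_losses(critic, post(ctx_resp), prior(ctx))

In a full model, gen_loss would be combined with the decoder's response reconstruction loss, and the critic and encoder updates would alternate, with the critic's parameters frozen during the encoder step.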

Keywords: Dialogue system; co-attention; adversarial learning; external knowledge
DOI: 10.1145/3384675
Indexed by: SCI
Language: English
Funding Projects: National Key Research and Development Program of China [2017YFB1002804]; National Natural Science Foundation of China [61720106006]; National Natural Science Foundation of China [61572503]; National Natural Science Foundation of China [61802405]; National Natural Science Foundation of China [61872424]; National Natural Science Foundation of China [61702509]; National Natural Science Foundation of China [61832002]; National Natural Science Foundation of China [61936005]; National Natural Science Foundation of China [U1705262]; Key Research Program of Frontier Sciences, CAS [QYZDJ-SSW-JSC039]; K.C. Wong Education Foundation
Funders: National Key Research and Development Program of China; National Natural Science Foundation of China; Key Research Program of Frontier Sciences, CAS; K.C. Wong Education Foundation
WOS Research Area: Computer Science
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Information Systems
WOS Record ID: WOS:000583127700002
Publisher: ASSOC COMPUTING MACHINERY
Sub-direction Classification (Seven Major Directions): Natural Language Processing
Citation Statistics
Times Cited (WOS): 3
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/41806
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Multimedia Computing
Corresponding Author: Xu, Changsheng
Affiliations:
1. Chinese Acad Sci, Natl Lab Pattern Recognit, Inst Automat, Beijing, Peoples R China
2. Univ Chinese Acad Sci, Sch Artificial Intelligence, 95 ZhongGuanChun East Rd, Beijing 100190, Peoples R China
3. Univ Chinese Acad Sci, 95 ZhongGuanChun East Rd, Beijing 100190, Peoples R China
4. Peng Cheng Lab, 95 ZhongGuanChun East Rd, Beijing 100190, Peoples R China
First Author Affiliation: National Laboratory of Pattern Recognition
Corresponding Author Affiliation: National Laboratory of Pattern Recognition
Recommended Citation:
GB/T 7714: Zhang, Yingying, Fang, Quan, Qian, Shengsheng, et al. Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation[J]. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2020, 11(4): 20.
APA: Zhang, Yingying, Fang, Quan, Qian, Shengsheng, & Xu, Changsheng. (2020). Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 11(4), 20.
MLA: Zhang, Yingying, et al. "Knowledge-aware Attentive Wasserstein Adversarial Dialogue Response Generation". ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY 11.4 (2020): 20.
Files in This Item:
File Name (Size) | Document Type | Version | Access | License
Knowledge-aware Attentive Wasserstein Adversarial.pdf (1626 KB) | Journal Article | Author's Accepted Manuscript | Open Access | CC BY-NC-SA
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.