Adversarial Training for Relation Classification with Attention based Gate Mechanism
Pengfei Cao 1,2; Yubo Chen 1,2; Kang Liu 1,2; Jun Zhao 1,2
2018-07-20
Conference name: The China Conference on Knowledge Graph and Semantic Computing
Conference dates: 14-17 August 2018
Conference location: Tianjin, China
Abstract

In recent years, deep neural networks have achieved significant success in relation classification and many other natural language processing tasks. However, existing neural networks for relation classification rely heavily on the quality of labelled data and tend to be overconfident about noise in the input signals, which may limit their robustness and generalization. In this paper, we apply adversarial training to relation classification by adding perturbations to the input vectors of a bidirectional long short-term memory network rather than to the original input itself. In addition, we propose an attention-based gate module, which can not only discern the important information when learning sentence representations but also adaptively concatenate sentence-level and lexical-level features. Experiments on the SemEval-2010 Task 8 benchmark dataset show that our model significantly outperforms other state-of-the-art models.
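The two ideas summarized in the abstract (adversarial perturbation of the embedding vectors fed to the BiLSTM, and gated fusion of sentence-level and lexical-level features) can be illustrated with a minimal sketch. The snippet below is not taken from the paper; it assumes PyTorch, a hypothetical classifier `model` that consumes embeddings directly and returns logits, an FGM-style L2-scaled perturbation, and illustrative names (`embedding_perturbation`, `GatedFusion`, `epsilon`).

import torch
import torch.nn as nn
import torch.nn.functional as F

def embedding_perturbation(model, embeddings, labels, epsilon=0.01):
    """Return a small adversarial perturbation computed on the embedding
    vectors (not on the raw tokens), scaled to L2 norm epsilon per token.
    `model` is an assumed classifier that maps embeddings to logits."""
    embeddings = embeddings.detach().requires_grad_(True)
    loss = F.cross_entropy(model(embeddings), labels)
    grad, = torch.autograd.grad(loss, embeddings)
    # Normalize the gradient so the perturbation has a bounded norm.
    return (epsilon * grad / (grad.norm(dim=-1, keepdim=True) + 1e-12)).detach()

class GatedFusion(nn.Module):
    """Hypothetical gate that adaptively mixes a sentence-level vector `s`
    with a lexical-level feature vector `l` of the same dimension."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, s, l):
        g = torch.sigmoid(self.gate(torch.cat([s, l], dim=-1)))
        return g * s + (1 - g) * l

In a training step, one would typically add the perturbation to the clean embeddings and combine both losses, e.g. F.cross_entropy(model(emb + r_adv), y) added to the original cross-entropy term; the exact weighting in the paper may differ from this sketch.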

Indexed by: EI
Language: English
Sub-direction (of the seven major research directions): Natural Language Processing
State Key Laboratory planned research direction: Speech and Language Processing
Dataset associated with the paper requiring deposit:
Document type: Conference paper
Item identifier: http://ir.ia.ac.cn/handle/173211/52186
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems_Natural Language Processing
Author affiliations: 1. University of Chinese Academy of Sciences
2. Institute of Automation, Chinese Academy of Sciences
First author affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation
GB/T 7714
Pengfei Cao, Yubo Chen, Kang Liu, et al. Adversarial Training for Relation Classification with Attention based Gate Mechanism[C], 2018.
Files in this item
File name/size: 9-CCKS2018-第一作者.pdf (985KB)
Document type: Conference paper
Open access type: Open Access
License: CC BY-NC-SA