Knowledge-Enhanced Natural Language Inference Based on Knowledge Graphs
Wang, Zikang1,2; Li, Linjing1,2,3; Zeng, Daniel1,2,3
2020-12
Conference: International Conference on Computational Linguistics
Conference dates: December 8-13, 2020
Conference location: Online
Abstract

Natural Language Inference (NLI) is a vital task in natural language processing: it aims to identify the logical relationship between two sentences. Most existing approaches make such inferences based on semantic knowledge obtained from training corpora; background knowledge is rarely used, or is limited to a few specific types. In this paper, we propose a novel Knowledge Graph-enhanced NLI (KGNLI) model that leverages background knowledge stored in knowledge graphs for NLI. The KGNLI model consists of three components: a semantic-relation representation module, a knowledge-relation representation module, and a label prediction module. Unlike previous methods, the proposed KGNLI model can flexibly combine various kinds of background knowledge. Experiments on four benchmarks, SNLI, MultiNLI, SciTail, and BNLI, validate the effectiveness of our model.
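The abstract describes a three-module architecture. As a purely illustrative sketch (the function names, feature definitions, dimensions, and fusion-by-concatenation step below are assumptions, not the paper's actual design), the pipeline can be pictured as: compute a semantic-relation representation from the sentence pair, compute a knowledge-relation representation from a knowledge graph, then fuse both in a label prediction module.

```python
import math
import random

random.seed(0)

# Hypothetical toy version of the three KGNLI modules; every detail here
# (overlap features, triple counting, linear scoring) is an illustrative
# stand-in for the paper's learned neural components.

DIM = 4
LABELS = ["entailment", "neutral", "contradiction"]

def semantic_relation(premise, hypothesis):
    """Semantic-relation module (toy): bag-of-words overlap feature."""
    p, h = set(premise.split()), set(hypothesis.split())
    overlap = len(p & h) / max(len(p | h), 1)
    return [overlap] * DIM

def knowledge_relation(premise, hypothesis, kg):
    """Knowledge-relation module (toy): count KG triples linking the pair."""
    hits = sum(1 for (s, _r, o) in kg if s in premise and o in hypothesis)
    return [float(hits)] * DIM

def predict_label(premise, hypothesis, kg, weights):
    """Label prediction module: fuse both representations, score, softmax."""
    features = (semantic_relation(premise, hypothesis)
                + knowledge_relation(premise, hypothesis, kg))
    scores = [sum(w * f for w, f in zip(row, features)) for row in weights]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    probs = [e / total for e in exps]
    return LABELS[probs.index(max(probs))], probs

# Tiny made-up knowledge graph and random (untrained) weights.
kg = [("cat", "is_a", "animal")]
weights = [[random.uniform(-1, 1) for _ in range(2 * DIM)] for _ in LABELS]
label, probs = predict_label("a cat sleeps", "an animal sleeps", kg, weights)
print(label, probs)
```

The point of the sketch is only the modular structure: because the knowledge-relation representation is computed separately and fused at prediction time, different knowledge sources can be swapped in without changing the rest of the pipeline.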

Subject area: Natural Language Processing
Document type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/44376
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Internet Big Data and Information Security
Affiliations:
1. State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
3. Shenzhen Artificial Intelligence and Data Science Institute (Longhua)
First author's affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Wang, Zikang; Li, Linjing; Zeng, Daniel. Knowledge-Enhanced Natural Language Inference Based on Knowledge Graphs[C], 2020.
Files in this item:
2020_coling_nli.pdf (491 KB), conference paper, open access, licensed CC BY-NC-SA
Unless otherwise specified, all content in this system is protected by copyright, with all rights reserved.