Knowledge Commons of Institute of Automation, CAS
Distant supervision for relation extraction with hierarchical selective attention
Peng Zhou1,2; Jiaming Xu1
Journal | Neural Networks
Publication Year | 2018
Issue | 108
Pages | 240-247
Article Type | Full Paper
Abstract | Distant supervised relation extraction is an important task in the field of natural language processing. There are two main shortcomings in most state-of-the-art methods. One is that they take all sentences of an entity pair as input, which results in a large computational cost; in fact, a few of the most relevant sentences are enough to recognize the relation of an entity pair. To tackle these problems, we propose a novel hierarchical selective attention network for relation extraction under distant supervision. Our model first selects the most relevant sentences by applying coarse sentence-level attention to all sentences of an entity pair, then employs word-level attention to construct sentence representations and fine sentence-level attention to aggregate these sentence representations. Experimental results on a widely used dataset demonstrate that our method performs significantly better than most existing methods.
Keywords | Relation extraction; Distant supervision; Hierarchical attention; Piecewise convolutional neural networks
Language | English
Document Type | Journal article
Identifier | http://ir.ia.ac.cn/handle/173211/40656
Collection | Laboratory of Cognition and Decision Intelligence for Complex Systems_Auditory Models and Cognitive Computing
Corresponding Author | Zhenyu Qi
Affiliations | 1. CASIA 2. University of Chinese Academy of Sciences (UCAS), China 3. Center for Excellence in Brain Science and Intelligence Technology, CAS, China
First Author Affiliation | Institute of Automation, Chinese Academy of Sciences
Corresponding Author Affiliation | Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | Peng Zhou, Jiaming Xu, Zhenyu Qi, et al. Distant supervision for relation extraction with hierarchical selective attention[J]. Neural Networks, 2018(108): 240-247.
APA | Peng Zhou, Jiaming Xu, Zhenyu Qi, Hongyun Bao, Zhineng Chen, & Bo Xu. (2018). Distant supervision for relation extraction with hierarchical selective attention. Neural Networks(108), 240-247.
MLA | Peng Zhou, et al. "Distant supervision for relation extraction with hierarchical selective attention". Neural Networks 108 (2018): 240-247.
Files in This Item
File Name/Size | Document Type | Version | Open Access | License
2018 Neural Networks (663KB) | Journal article | Author's accepted manuscript | Open access | CC BY-NC-SA
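The selective attention described in the abstract can be sketched minimally: score every sentence of an entity pair against a relation query, keep only the top-k most relevant sentences (the coarse selection), then aggregate the survivors with a softmax-weighted sum (the fine attention). The NumPy sketch below is a hypothetical illustration, not the paper's implementation: the function names and the dot-product scorer are assumptions, and the paper's piecewise-CNN sentence encoder and word-level attention are omitted.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_selective_attention(sentences, query, k=2):
    """Select the k sentences most relevant to a relation query,
    then aggregate them into one bag-level representation.

    sentences : (n, d) array of sentence vectors for one entity pair
    query     : (d,) relation query vector
    """
    scores = sentences @ query            # one relevance score per sentence
    top = np.argsort(scores)[-k:]         # coarse selection: keep the k best
    weights = softmax(scores[top])        # fine attention over the survivors
    return weights @ sentences[top]       # weighted sum -> (d,) bag vector
```

With k equal to the bag size this degenerates to ordinary softmax attention over all sentences; the point of the coarse stage is that the (more expensive) fine attention and sentence encoding only need to run on the few selected sentences.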
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.