Knowledge Commons of Institute of Automation, CAS
Event Detection via Gated Multilingual Attention Mechanism
Liu, Jian 1,2
2018-02
Conference Name | AAAI
Pages | 4865-4872
Conference Date | 2018-02
Conference Venue | New Orleans
Abstract | Identifying event instances in text plays a critical role in building NLP applications such as Information Extraction (IE) systems. However, most existing methods for this task focus only on monolingual clues of a specific language and ignore the massive information provided by other languages. Data scarcity and monolingual ambiguity hinder the performance of these monolingual approaches. In this paper, we propose a novel multilingual approach, dubbed the Gated Multilingual Attention (GMLATT) framework, to address the two issues simultaneously. Specifically, to alleviate the data scarcity problem, we exploit the consistent information in multilingual data via a context attention mechanism, which takes advantage of the consistent evidence in multilingual data rather than learning only from monolingual data. To deal with the monolingual ambiguity problem, we propose gated cross-lingual attention to exploit the complementary information conveyed by multilingual data, which is helpful for disambiguation. The cross-lingual attention gate serves as a sentinel modelling the confidence of the clues provided by other languages and controls the integration of information across languages. We have conducted extensive experiments on the ACE 2005 benchmark. Experimental results show that our approach significantly outperforms state-of-the-art methods. (A hedged code sketch of the gating step follows the recommended citation below.)
Indexed By | EI
Language | English
Document Type | Conference Paper
Identifier | http://ir.ia.ac.cn/handle/173211/39210
Collection | State Key Laboratory of Multimodal Artificial Intelligence Systems, Natural Language Processing
Author Affiliations | 1. Institute of Automation, Chinese Academy of Sciences 2. University of Chinese Academy of Sciences
First Author Affiliation | Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | Liu, Jian, Chen, Yubo, Liu, Kang, et al. Event Detection via Gated Multilingual Attention Mechanism[C], 2018: 4865-4872.
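The abstract describes the cross-lingual attention gate only at a high level: a learned sentinel that weighs monolingual clues against evidence attended from other languages. The following is a minimal, hypothetical sketch of how such a sigmoid gate might fuse the two representations; all names and shapes (`GatedFusion`, `d_model`, `h_mono`, `h_cross`) are assumptions for illustration, not taken from the paper.

```python
# Hypothetical sketch of a gated cross-lingual fusion step.
# Names and dimensions are assumptions, not from the GMLATT paper.
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    """Fuses a monolingual representation with a cross-lingually
    attended one through a learned sigmoid gate (the "sentinel")."""

    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, h_mono: torch.Tensor, h_cross: torch.Tensor) -> torch.Tensor:
        # Gate values near 1 keep the monolingual clue; values near 0
        # defer to the evidence attended from the other language.
        g = torch.sigmoid(self.gate(torch.cat([h_mono, h_cross], dim=-1)))
        return g * h_mono + (1.0 - g) * h_cross


if __name__ == "__main__":
    d = 256
    fusion = GatedFusion(d)
    h_mono = torch.randn(8, d)   # source-language token representations
    h_cross = torch.randn(8, d)  # representations attended over a translation
    print(fusion(h_mono, h_cross).shape)  # torch.Size([8, 256])
```

The design mirrors the abstract's description: because the gate is learned per dimension, the model can discount cross-lingual evidence when it is noisy (e.g., from imperfect translation or alignment) rather than integrating it uniformly.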
Files in This Item |
File Name/Size | Document Type | Version Type | Open Access Type | License
16371-76777-1-PB.pdf (718KB) | Conference Paper | | Open Access | CC BY-NC-SA