Inner Attention based Recurrent Neural Networks for Answer Selection
Wang Bingning; Liu Kang; Zhao Jun
2016
Conference: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics
Volume: Volume 1, Long Papers
Pages: 1288-1297
Conference Date: 2016
Conference Venue: Germany
Abstract
Attention-based recurrent neural networks have shown advantages in representing natural language sentences (Hermann et al., 2015; Rocktäschel et al., 2015; Tan et al., 2015). Built on recurrent neural networks (RNN), these models add external attention information to the hidden representations to obtain an attentive sentence representation. Despite the improvement over non-attentive models, the attention mechanism under RNN is not well studied. In this work, we analyze the deficiency of traditional attention-based RNN models quantitatively and qualitatively. We then present three new RNN models that add attention information before the RNN hidden representation, which show advantages in representing sentences and achieve new state-of-the-art results on the answer selection task.
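The core idea in the abstract — weighting the input word vectors by their relevance to the question *before* they enter the RNN, rather than attending over hidden states afterwards — can be sketched as follows. This is a minimal, hypothetical simplification for illustration only: the paper's actual models use learned attention parameters and GRU cells, whereas here relevance is just a raw dot product with the question vector.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def inner_attention(word_vecs, question_vec):
    """Scale each answer word vector by its attention weight w.r.t. the
    question, so the (omitted) RNN sees question-focused inputs.
    Hypothetical simplification: a real model scores relevance with a
    learned bilinear term, not a plain dot product."""
    scores = [sum(w * q for w, q in zip(wv, question_vec)) for wv in word_vecs]
    weights = softmax(scores)
    return [[a * x for x in wv] for a, wv in zip(weights, word_vecs)]

# Toy example: three 2-d word vectors and a question vector.
words = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
question = [1.0, 0.0]
attended = inner_attention(words, question)
```

Words aligned with the question (the first and third) receive larger weights than the orthogonal second word, so the downstream RNN would read an input sequence already biased toward question-relevant content.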
Keywords: Answer Selection; Question Answering; Deep Learning
Indexed By: EI
Language: English
Document Type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/20182
Collection: National Laboratory of Pattern Recognition — Natural Language Processing
Corresponding Author: Liu Kang
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Wang Bingning, Liu Kang, Zhao Jun. Inner Attention based Recurrent Neural Networks for Answer Selection[C], 2016: 1288-1297.
Files in This Item:
Inner Attention based Recurrent Neural Networks for Answer Selection.pdf (1729 KB), Conference paper, Open Access, CC BY-NC-SA
File Name: Inner Attention based Recurrent Neural Networks for Answer Selection.pdf
Format: Adobe PDF
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.