Reading selectively via Binary Input Gated Recurrent Unit
Li Z(李哲); Wang PS(王培松); Lu HQ(卢汉清); Cheng J(程健)
2019-08
Conference: International Joint Conference on Artificial Intelligence
Conference date: 2019-08
Conference location: Macau, China
Abstract

Recurrent Neural Networks (RNNs) have shown great promise in sequence modeling tasks. The Gated Recurrent Unit (GRU) is one of the most widely used recurrent structures, striking a good trade-off between performance and computational cost. However, its practical implementation based on soft gates only partially achieves the goal of controlling information flow, and we can hardly explain what the network has learnt internally. Inspired by human reading, we introduce the binary input gated recurrent unit (BIGRU), a GRU-based model that uses a binary input gate in place of the GRU's reset gate. By doing so, our model can read selectively during inference. Our experiments show that BIGRU mainly ignores the conjunctions, adverbs and articles that make little difference to document understanding, which helps us further understand how the network works. In addition, owing to reduced interference from redundant information, our model outperforms the baseline GRU on all the testing tasks.
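The abstract describes replacing the GRU's soft reset gate with a binary input gate that decides whether the current token is read at all. The following scalar sketch illustrates the idea; the parameter names, the hard-threshold forward pass, and the exact gate placement are illustrative assumptions for this summary, not the paper's precise formulation (in practice such a discrete gate would be trained with a gradient estimator, e.g. straight-through).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def binary_gate(z):
    # Hard threshold at inference time: the token is either read (1) or skipped (0).
    return 1.0 if sigmoid(z) >= 0.5 else 0.0

def bigru_step(x, h, p):
    """One scalar BIGRU-style step (illustrative sketch).

    The update gate z stays soft, as in a standard GRU, while the input
    gate b is binary. When b = 0 the current input x is ignored, so the
    candidate state depends only on the previous hidden state h.
    """
    z = sigmoid(p["wz"] * x + p["uz"] * h + p["bz"])        # soft update gate
    b = binary_gate(p["wb"] * x + p["ub"] * h + p["bb"])    # binary input gate
    h_tilde = math.tanh(p["wh"] * (b * x) + p["uh"] * h + p["bh"])  # candidate
    return (1.0 - z) * h + z * h_tilde                       # GRU-style interpolation
```

With the gate closed, two different inputs produce identical hidden states, which is the mechanism that lets the model skip uninformative words such as articles and conjunctions.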

Language: English
Research sub-direction: AI Chips and Intelligent Computing
Document type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/23693
Collection: Zidong Taichu (紫东太初) Large Model Research Center, Image and Video Analysis
Corresponding author: Li Z(李哲)
Affiliation: Institute of Automation, Chinese Academy of Sciences
First author's affiliation: Institute of Automation, Chinese Academy of Sciences
Corresponding author's affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Li Z, Wang PS, Lu HQ, et al. Reading selectively via Binary Input Gated Recurrent Unit[C], 2019.
Files in this item:
File name/size: ijcai19.pdf (1207 KB)
Document type: Conference paper
Open access license: CC BY-NC-SA

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.