Memory-Augmented Attention Model for Scene Text Recognition
Wang, Cong (1,2); Yin, Fei (1,2); Liu, Cheng-Lin (1,2,3)
2018
Conference Name: The 16th International Conference on Frontiers in Handwriting Recognition (ICFHR)
Conference Date: August 5-8, 2018
Conference Venue: Niagara Falls, USA
Abstract

Natural scene text recognition is a very challenging task. The attention-based encoder-decoder framework has achieved state-of-the-art performance. However, for complex and/or low-quality images, the alignments estimated by a content-based attention network are not accurate enough, and consequently the generated glimpse vector is not expressive enough to represent the character predicted at the current time step. To address this problem, in this paper we propose a memory-augmented attention model for scene text recognition. The proposed memory-augmented attention network (MAAN) feeds the part of the character sequence already generated, together with the full history of attended alignments, into the attention model when predicting the character at the current time step. The whole network can be trained end-to-end. Experimental results on several challenging benchmark datasets demonstrate that the proposed memory-augmented attention model achieves comparable or better performance than state-of-the-art methods.
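For intuition only, the sketch below (hypothetical PyTorch-style Python, not the authors' released code) shows one way a single attention step can be conditioned on the decoding memory described in the abstract: the decoder state, which carries the already-generated character prefix, and the accumulated alignment history. All names (MemoryAugmentedAttention, enc_feats, dec_state, align_history) and the convolutional summary of the alignment history are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryAugmentedAttention(nn.Module):
    """Attention whose score depends not only on the encoder features and the
    current decoder state (which summarizes the characters generated so far),
    but also on the accumulated history of past alignments."""

    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.proj_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.proj_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        # 1-D convolution over the alignment history gives each encoder
        # position a local summary of where the decoder attended before.
        self.proj_align = nn.Conv1d(1, attn_dim, kernel_size=5, padding=2)
        self.score = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, enc_feats, dec_state, align_history):
        # enc_feats:     (B, T, enc_dim)  encoder feature sequence
        # dec_state:     (B, dec_dim)     decoder state at the current step
        # align_history: (B, T)           sum of attention weights of past steps
        loc = self.proj_align(align_history.unsqueeze(1)).transpose(1, 2)  # (B, T, attn_dim)
        e = self.score(torch.tanh(
            self.proj_enc(enc_feats)
            + self.proj_dec(dec_state).unsqueeze(1)
            + loc
        )).squeeze(-1)                                      # (B, T) unnormalized scores
        alpha = F.softmax(e, dim=-1)                        # alignment at the current step
        glimpse = torch.bmm(alpha.unsqueeze(1), enc_feats).squeeze(1)  # (B, enc_dim)
        return glimpse, alpha

# Example of one decoding step with toy tensors.
B, T, enc_dim, dec_dim, attn_dim = 2, 26, 512, 256, 128
attn = MemoryAugmentedAttention(enc_dim, dec_dim, attn_dim)
glimpse, alpha = attn(torch.randn(B, T, enc_dim),
                      torch.randn(B, dec_dim),
                      torch.zeros(B, T))      # empty alignment history at the first step
```

The design choice illustrated here is only that the attention score receives the decoding memory as extra input; the paper's actual parameterization of MAAN may differ.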

Keywords: Scene Text Recognition; Attention Network; Memory Augmentation
Indexed by: EI
Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/38554
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Pattern Analysis and Learning
作者单位1.中国科学院自动化研究所
2.中国科学院大学
3.中国科学院脑科学与智能技术卓越创新中心
第一作者单位中国科学院自动化研究所
Recommended Citation (GB/T 7714):
Wang, Cong, Yin, Fei, Liu, Cheng-Lin. Memory-Augmented Attention Model for Scene Text Recognition[C], 2018.