Densely Connected Attention Flow for Visual Question Answering
Liu, Fei (1,2); Liu, Jing (1,2); Fang, Zhiwei (1,2); Hong, Richang (3)
Year: 2019
Conference: International Joint Conference on Artificial Intelligence (IJCAI)
Conference Date: August 2019
Conference Venue: Macau, China
Publisher: IJCAI
Abstract

Learning effective interactions between multimodal features is at the heart of visual question answering (VQA). A common defect of existing VQA approaches is that they consider only a very limited number of interactions, which may not be enough to model the latent, complex image-question relations necessary for accurately answering questions. Therefore, in this paper, we propose a novel DCAF (Densely Connected Attention Flow) framework for modeling dense interactions. It densely connects all pairwise layers of the network via Attention Connectors, capturing fine-grained interplay between image and question across all hierarchical levels. The proposed Attention Connector efficiently connects the multimodal features at any two layers with symmetric co-attention, and produces interaction-aware attention features. Experimental results on three publicly available datasets show that the proposed method achieves state-of-the-art performance.
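The abstract does not include an implementation, so the following is a minimal PyTorch sketch of what a symmetric co-attention connector of the kind described could look like. The class name AttentionConnector follows the paper's terminology, but the bilinear affinity form, the residual fusion, and all dimensions are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of a symmetric co-attention connector in the spirit of
# the abstract. All names and shapes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionConnector(nn.Module):
    """Connects image features V and question features Q taken from any
    two layers via a shared affinity matrix, attending each modality
    over the other (symmetric co-attention)."""
    def __init__(self, dim: int):
        super().__init__()
        # Bilinear weight for the cross-modal affinity matrix.
        self.w = nn.Parameter(torch.randn(dim, dim) * dim ** -0.5)

    def forward(self, v: torch.Tensor, q: torch.Tensor):
        # v: (batch, n_regions, dim); q: (batch, n_words, dim)
        affinity = v @ self.w @ q.transpose(1, 2)  # (batch, n_regions, n_words)
        # Attend question words for each image region, and vice versa.
        v_att = F.softmax(affinity, dim=2) @ q                    # (batch, n_regions, dim)
        q_att = F.softmax(affinity, dim=1).transpose(1, 2) @ v    # (batch, n_words, dim)
        # Interaction-aware features: one plausible fusion is a residual
        # combination of each modality with its attended context.
        return v + v_att, q + q_att

# Usage: densely connecting the network would mean applying such a
# connector to every pair of (image-layer, question-layer) outputs.
connector = AttentionConnector(dim=512)
v = torch.randn(2, 36, 512)   # e.g. 36 region features per image
q = torch.randn(2, 14, 512)   # e.g. 14 word features per question
v_new, q_new = connector(v, q)
```

The residual fusion above is only one way to produce "interaction-aware" features; the paper's exact fusion and dense-connection wiring are specified in the full text.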

Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/48557
Collection: Image and Video Analysis, Zidong Taichu Large Model Research Center
Corresponding Author: Liu, Jing
作者单位1.National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
2.University of Chinese Academy of Sciences
3.School of Computer and Information, Hefei University of Technology
First Author Affiliation: National Laboratory of Pattern Recognition
Corresponding Author Affiliation: National Laboratory of Pattern Recognition
Recommended Citation (GB/T 7714):
Liu, Fei, Liu, Jing, Fang, Zhiwei, et al. Densely Connected Attention Flow for Visual Question Answering [C]. IJCAI, 2019.
File: 0122.pdf (681 KB), Conference Paper, Open Access, License: CC BY-NC-SA