MFC: A Multi-scale Fully Convolutional Approach for Visual Instance Retrieval
Hao, Jiedong [1,2]; Wang, Wei [1]; Dong, Jing [1]; Tan, Tieniu [1]
2017-09
Conference Name: 2017 IEEE International Conference on Multimedia & Expo Workshops (ICMEW)
Conference Date: 10-14 July 2017
Conference Venue: Hong Kong
Abstract: Previous work has shown that feature maps of deep convolutional neural networks (CNNs) can be interpreted as feature representations of an image. Image features aggregated from these feature maps have achieved steady progress in terms of performance on visual instance retrieval tasks in recent years. The key to the success of such methods is feature representation. In this paper, we study how to represent an image using discriminative features. We first demonstrate that image size is an important factor which affects the performance of instance retrieval but has not been thoroughly discussed in previous work. Based on experimental evaluations, we propose a multi-scale fully convolutional (MFC) approach to encode the image efficiently and effectively. The proposed method is simple to implement and does not employ sophisticated post-processing techniques such as query expansion, yet it shows promising results on four public datasets.
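The abstract describes encoding an image by running a fully convolutional CNN on the image at several input sizes and aggregating the resulting feature maps into one descriptor. Below is a minimal, illustrative sketch of that idea in PyTorch; the VGG16 backbone, the scale set (0.5, 1.0, 1.5), and the max-pooling aggregation are assumptions chosen for illustration, not the exact MFC configuration reported in the paper.

# Illustrative sketch of multi-scale fully convolutional feature extraction.
# Backbone, scales, and pooling are assumptions, not the paper's exact setup.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# Fully convolutional backbone: keep only the convolutional layers of VGG16,
# so inputs of arbitrary size produce spatial feature maps.
backbone = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def describe(image_path, scales=(0.5, 1.0, 1.5)):
    """Encode one image into a single descriptor by running the fully
    convolutional backbone at several input sizes and aggregating."""
    img = Image.open(image_path).convert("RGB")
    x = preprocess(img).unsqueeze(0)            # 1 x 3 x H x W
    descriptors = []
    with torch.no_grad():
        for s in scales:
            xs = F.interpolate(x, scale_factor=s, mode="bilinear",
                               align_corners=False)
            fmap = backbone(xs)                  # 1 x C x h x w feature maps
            # Global max pooling over spatial locations (MAC-style descriptor).
            desc = fmap.amax(dim=(2, 3))         # 1 x C
            descriptors.append(F.normalize(desc, dim=1))
    # Average the per-scale descriptors, then L2-normalize again so that
    # a dot product between two descriptors gives cosine similarity.
    return F.normalize(torch.stack(descriptors).mean(dim=0), dim=1)

With descriptors normalized this way, retrieval reduces to ranking database images by the dot product between their descriptors and the query descriptor.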
Keywords: Visual Instance Retrieval; Image Resizing Strategy; Multi-scale Representation; Fully Convolutional Neural Network
Indexed By: EI
Language: English
Document Type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/20992
Collection: Center for Research on Intelligent Perception and Computing
Corresponding Author: Dong, Jing
Affiliation: 1. Center for Research on Intelligent Perception and Computing, Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
First Author Affiliation: Center for Research on Intelligent Perception and Computing
Corresponding Author Affiliation: Center for Research on Intelligent Perception and Computing
Recommended Citation (GB/T 7714):
Hao, Jiedong, Wang, Wei, Dong, Jing, et al. MFC: A Multi-scale Fully Convolutional Approach for Visual Instance Retrieval[C], 2017.
Files in This Item:
ICME-w53.pdf (922 KB), Conference paper, Open Access, License: CC BY-NC-SA
 
