ATTENTION-GUIDED KNOWLEDGE DISTILLATION FOR EFFICIENT SINGLE-STAGE DETECTOR
Wang, Tong1,2; Zhu, Yousong1,3; Zhao, Chaoyang1; Zhao, Xu1; Wang, Jinqiao1,2,4; Tang, Ming1
2021-07
Conference Name: IEEE International Conference on Multimedia & Expo (ICME)
Pages: 1-6
Conference Date: 2021-07-05
Conference Place: Online
Abstract

Knowledge distillation has been successfully applied to image classification for model acceleration. Some works have also applied this technique to object detection, but they all treat different feature regions equally when performing feature mimicking. In this paper, we propose an end-to-end attention-guided knowledge distillation method to train efficient single-stage detectors with much smaller backbones.
More specifically, we introduce an attention mechanism that prioritizes the transfer of important knowledge by focusing on a sparse set of hard samples, leading to a more thorough distillation process. The proposed distillation method also provides an easy way to train efficient detectors without the tedious ImageNet pre-training procedure. Extensive experiments on the PASCAL VOC and CityPersons datasets demonstrate the effectiveness of the proposed approach. We achieve 57.96% and 69.48% mAP on VOC07 with 1/8 VGG16 and 1/4 VGG16 backbones, greatly outperforming their ImageNet pre-trained counterparts by 11.7% and 7.1% respectively.
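
To make the abstract's core idea concrete, below is a minimal sketch of an attention-weighted feature-mimic loss in PyTorch. The attention map (channel-averaged teacher activations, softmax-normalized over spatial locations), the temperature, and all function and variable names are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def attention_guided_mimic_loss(student_feat: torch.Tensor,
                                teacher_feat: torch.Tensor,
                                temperature: float = 0.5) -> torch.Tensor:
    """Hypothetical attention-weighted feature-mimic loss (a sketch).

    student_feat, teacher_feat: (N, C, H, W) backbone feature maps;
    a 1x1 conv adapter on the student is assumed to have matched C.
    """
    n, c, h, w = teacher_feat.shape
    # Spatial attention from the teacher: channel-wise mean of absolute
    # activations, softmax-normalized over the H*W locations so that
    # salient ("hard") regions receive most of the distillation weight.
    attn = teacher_feat.detach().abs().mean(dim=1).view(n, -1)   # (N, H*W)
    attn = F.softmax(attn / temperature, dim=1).view(n, 1, h, w)
    # Per-location squared error between student and teacher features,
    # re-weighted by the attention map instead of a uniform average.
    sq_err = (student_feat - teacher_feat.detach()).pow(2).mean(dim=1, keepdim=True)
    return (attn * sq_err).sum(dim=(1, 2, 3)).mean()
```

In a full training pipeline, a term like this would be added to the detector's standard classification and localization losses, replacing the uniform spatial average used by plain feature mimicking.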
 

Subject Area: Pattern Recognition
MOST Discipline Catalogue: Engineering::Computer Science and Technology (degrees conferrable in engineering and science)
Indexed By: EI
Funding Project: National Natural Science Foundation of China [61806200]; National Natural Science Foundation of China [61876086]; National Natural Science Foundation of China [61772527]
Language: English
Sub-direction Classification: Object Detection, Tracking and Recognition
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/47417
Collection: National Laboratory of Pattern Recognition / Image and Video Analysis
Affiliation:
1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
2. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
3. ObjectEye Inc., Beijing, China
4. NEXWISE Co., Ltd, Guangzhou, China
First Author Affiliation: Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Recommended Citation (GB/T 7714):
Wang, Tong, Zhu, Yousong, Zhao, Chaoyang, et al. ATTENTION-GUIDED KNOWLEDGE DISTILLATION FOR EFFICIENT SINGLE-STAGE DETECTOR[C], 2021: 1-6.
Files in This Item:
File Name/Size: ATTENTION-GUIDED KNOWLEDGE DISTILLATION FOR EFFICIENT SINGLE-STAGE-ICME2021.pdf (574KB)
DocType: Conference Paper
Access: Open Access
License: CC BY-NC-SA
Format: Adobe PDF
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.