One-stage object detection knowledge distillation via adversarial learning
Dong, Na1; Zhang, Yongqiang1; Ding, Mingli1; Xu, Shibiao2; Bai, Yancheng3
Journal: APPLIED INTELLIGENCE
ISSN: 0924-669X
Date: 2021-07-24
Pages: 17
Corresponding author: Zhang, Yongqiang (yongqiang.zhang.hit@gmail.com)
Abstract: Impressive methods for object detection have been proposed based on convolutional neural networks (CNNs); however, they usually rely on computationally expensive deep networks to achieve such performance. Knowledge distillation has recently attracted much attention in image classification because it enables compact models that reduce computation while preserving performance. Moreover, the best-performing deep neural networks often average the outputs of multiple networks, but the memory required to store these networks and the time required to execute them at inference prohibit their use in real-time applications. In this paper, we present a knowledge distillation method for one-stage object detection that can distill a variety of large, complex trained networks into a lightweight network. To transfer diverse knowledge from various trained one-stage detection networks, an adversarial learning strategy is employed as supervision: it guides and optimizes the lightweight student network to recover the knowledge of the teacher networks, while simultaneously training a discriminator module to distinguish teacher features from student features. The proposed method has two main advantages: (1) the lightweight student model learns knowledge from the teacher that contains richer discriminative information than a model trained from scratch; (2) inference is faster than with traditional multi-network ensemble methods. Extensive experiments on the PASCAL VOC and MS COCO datasets verify the effectiveness of the proposed method for one-stage object detection, which obtains 3.43%, 2.48%, and 5.78% mAP improvements for the vgg11-ssd, mobilenetv1-ssd-lite, and mobilenetv2-ssd-lite student networks on PASCAL VOC 2007, respectively. Furthermore, with the multi-teacher ensemble method, vgg11-ssd gains a remarkable 7.10% improvement.
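The adversarial supervision described in the abstract can be sketched under simplifying assumptions. The toy sketch below stands in for the paper's method: hypothetical random vectors replace CNN feature maps, and a linear logistic discriminator replaces the learned discriminator module. The discriminator is trained to label teacher features 1 and student features 0, while the student's adversarial term rewards fooling it; all names and shapes here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Toy stand-ins for CNN feature maps (hypothetical: 64 samples, 16-dim).
teacher_feat = rng.normal(1.0, 0.5, size=(64, 16))
student_feat = rng.normal(0.0, 0.5, size=(64, 16))

# Linear logistic discriminator: outputs P(feature came from the teacher).
w, b = np.zeros(16), 0.0

def disc_loss(w, b, t, s):
    """Binary cross-entropy: teacher features labeled 1, student features 0."""
    pt, ps = sigmoid(t @ w + b), sigmoid(s @ w + b)
    return -np.mean(np.log(pt + 1e-9)) - np.mean(np.log(1.0 - ps + 1e-9))

def student_adv_loss(w, b, s):
    """Student's adversarial term: fool the discriminator into predicting 1."""
    return -np.mean(np.log(sigmoid(s @ w + b) + 1e-9))

def disc_step(w, b, t, s, lr=0.1):
    """One gradient-descent step on the discriminator loss."""
    pt, ps = sigmoid(t @ w + b), sigmoid(s @ w + b)
    gw = -((1.0 - pt) @ t) / len(t) + (ps @ s) / len(s)
    gb = -np.mean(1.0 - pt) + np.mean(ps)
    return w - lr * gw, b - lr * gb

before = disc_loss(w, b, teacher_feat, student_feat)  # = 2*log(2) at w=0, b=0
w, b = disc_step(w, b, teacher_feat, student_feat)
after = disc_loss(w, b, teacher_feat, student_feat)   # lower after the step
```

In the actual method the discriminator is a network module trained jointly with the detection and distillation losses; only the two-player loss structure is shown here.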
Keywords: Knowledge distillation; Object detection; Generative adversarial learning
DOI: 10.1007/s10489-021-02634-6
WOS keywords: DEEP; NETWORK; FUSION
Indexed by: SCI
Language: English
Funding project: China Postdoctoral Science Foundation [259822]
Funder: China Postdoctoral Science Foundation
WOS research area: Computer Science
WOS category: Computer Science, Artificial Intelligence
WOS accession number: WOS:000677240200002
Publisher: SPRINGER
Citation statistics
Times cited (WOS): 6
Document type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/45570
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems_3D Visual Computing
Affiliations:
1. Harbin Inst Technol, Sch Instrument Sci & Engn, Harbin, Peoples R China
2. Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
3. Chinese Acad Sci, Inst Software, Beijing, Peoples R China
Recommended citation:
GB/T 7714: Dong, Na, Zhang, Yongqiang, Ding, Mingli, et al. One-stage object detection knowledge distillation via adversarial learning[J]. APPLIED INTELLIGENCE, 2021: 17.
APA: Dong, Na, Zhang, Yongqiang, Ding, Mingli, Xu, Shibiao, & Bai, Yancheng. (2021). One-stage object detection knowledge distillation via adversarial learning. APPLIED INTELLIGENCE, 17.
MLA: Dong, Na, et al. "One-stage object detection knowledge distillation via adversarial learning". APPLIED INTELLIGENCE (2021): 17.
Files in this item: none.