PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient
Weihan Cao 1,3; Yifan Zhang 1; Jianfei Gao 2; Anda Cheng 1,3; Ke Cheng 1,3; Jian Cheng 1
2022
Conference: Advances in Neural Information Processing Systems 35 (NeurIPS 2022) Main Conference Track
Volume: 35
Pages: 15394-15406
Conference dates: November 28 - December 9, 2022
Location: New Orleans, USA
Abstract

Knowledge distillation (KD) is a widely used technique for training compact models in object detection. However, there is still little study on how to distill between heterogeneous detectors. In this paper, we empirically find that better FPN features from a heterogeneous teacher detector can help the student even though their detection heads and label assignments are different. However, directly aligning the feature maps to distill detectors suffers from two problems. First, the difference in feature magnitude between the teacher and the student can impose overly strict constraints on the student. Second, the FPN stages and channels with large feature magnitudes from the teacher model can dominate the gradient of the distillation loss, which overwhelms the effects of other features in KD and introduces much noise. To address these issues, we propose to imitate features with the Pearson Correlation Coefficient, focusing on the relational information from the teacher and relaxing constraints on the magnitude of the features. Our method consistently outperforms existing detection KD methods and works for both homogeneous and heterogeneous student-teacher pairs. Furthermore, it converges faster. With a powerful MaskRCNN-Swin detector as the teacher, ResNet-50-based RetinaNet and FCOS achieve 41.5% and 43.9% mAP on COCO2017, which are 4.1% and 4.8% higher than their baselines, respectively.
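The feature-imitation loss described in the abstract can be sketched as follows. This is a minimal NumPy illustration inferred from the abstract, not the authors' implementation; the function name `pkd_loss` and the per-sample, per-channel normalization scheme are assumptions. The key idea it demonstrates: standardizing each channel's feature map to zero mean and unit variance and then taking the MSE is, up to a constant factor, equivalent to one minus the Pearson correlation coefficient, so the loss depends only on the relational pattern of the features, not their magnitude.

```python
import numpy as np

def pkd_loss(feat_s, feat_t, eps=1e-6):
    """PKD-style feature imitation loss (sketch inferred from the abstract).

    feat_s, feat_t: student/teacher FPN feature maps of shape (N, C, H, W).
    Each channel map is standardized over its spatial dimensions before the
    MSE, so teacher channels with large magnitudes cannot dominate the loss.
    """
    def normalize(f):
        # Per-sample, per-channel standardization over the spatial dims.
        mean = f.mean(axis=(2, 3), keepdims=True)
        std = f.std(axis=(2, 3), keepdims=True)
        return (f - mean) / (std + eps)

    s = normalize(feat_s)
    t = normalize(feat_t)
    # 0.5 * MSE of standardized maps equals 1 - Pearson correlation
    # (up to the eps regularizer).
    return 0.5 * np.mean((s - t) ** 2)
```

Because of the standardization, rescaling or shifting the teacher features leaves the loss unchanged, which is exactly the magnitude invariance the abstract argues for.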

Keywords: Knowledge Distillation; Model Compression; Object Detection
Discipline: Engineering :: Control Science and Engineering
Indexed by: EI
Language: English
Sub-direction classification (of the seven major directions): Object Detection, Tracking and Recognition
State Key Laboratory planning direction: Visual Information Processing
Document type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/52086
Collection: Laboratory of Cognition and Decision Intelligence for Complex Systems, Efficient Intelligent Computing and Learning
Corresponding author: Yifan Zhang
Author affiliations:
1. NLPR & AIRIA, Institute of Automation, Chinese Academy of Sciences
2. Shanghai AI Laboratory
3. School of Artificial Intelligence, University of Chinese Academy of Sciences
First author affiliation: National Laboratory of Pattern Recognition
Corresponding author affiliation: National Laboratory of Pattern Recognition
Recommended citation (GB/T 7714):
Weihan Cao, Yifan Zhang, Jianfei Gao, et al. PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient [C], 2022: 15394-15406.
Files in this item:
NeurIPS-2022-pkd-general-distillation-framework-for-object-detectors-via-pearson-correlation-coefficient-Paper-Conference.pdf (2614 KB) | Conference paper | Open access | CC BY-NC-SA
File name: NeurIPS-2022-pkd-general-distillation-framework-for-object-detectors-via-pearson-correlation-coefficient-Paper-Conference.pdf
Format: Adobe PDF

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.