Knowledge Commons of Institute of Automation, CAS
PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient
Weihan Cao1,3; Yifan Zhang1; Jianfei Gao2; Anda Cheng1,3; Ke Cheng1,3; Jian Cheng1
2022
Conference | Advances in Neural Information Processing Systems 35 (NeurIPS 2022) Main Conference Track
Volume | 35
Pages | 15394-15406
Conference Dates | Monday, November 28 through Friday, December 9, 2022
Conference Location | New Orleans, USA
Abstract | Knowledge distillation (KD) is a widely used technique to train compact models in object detection. However, there is still a lack of study on how to distill between heterogeneous detectors. In this paper, we empirically find that better FPN features from a heterogeneous teacher detector can help the student, even though their detection heads and label assignments are different. However, directly aligning the feature maps to distill detectors suffers from two problems. First, the difference in feature magnitude between the teacher and the student can enforce overly strict constraints on the student. Second, the FPN stages and channels with large feature magnitude in the teacher model can dominate the gradient of the distillation loss, which overwhelms the effects of other features in KD and introduces much noise. To address these issues, we propose to imitate features with the Pearson Correlation Coefficient, focusing on the relational information from the teacher while relaxing constraints on the magnitude of the features. Our method consistently outperforms existing detection KD methods and works for both homogeneous and heterogeneous student-teacher pairs. Furthermore, it converges faster. With a powerful MaskRCNN-Swin detector as the teacher, ResNet-50 based RetinaNet and FCOS achieve 41.5% and 43.9% mAP on COCO2017, which are 4.1% and 4.8% higher than the baseline, respectively.
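The abstract's core idea, imitating FPN features through the Pearson correlation coefficient rather than raw magnitudes, can be sketched as follows. This is a minimal NumPy illustration rather than the authors' released implementation; the function name `pkd_loss` and the choice to standardize each channel over batch and spatial positions are assumptions made for the sketch.

```python
import numpy as np

def pkd_loss(feat_s, feat_t, eps=1e-6):
    """Per-channel standardization followed by MSE.

    Minimizing the MSE between standardized feature maps is equivalent
    to maximizing the per-channel Pearson correlation coefficient, which
    discards magnitude differences between teacher and student features.
    """
    def standardize(x):
        # (N, C, H, W) -> (C, N*H*W): pool batch and spatial positions per channel
        c = x.shape[1]
        x = np.transpose(x, (1, 0, 2, 3)).reshape(c, -1)
        mean = x.mean(axis=1, keepdims=True)
        std = x.std(axis=1, keepdims=True)
        return (x - mean) / (std + eps)

    s = standardize(feat_s.astype(np.float64))
    t = standardize(feat_t.astype(np.float64))
    # For unit-variance s and t, mean((s - t)**2) / 2 == 1 - Pearson correlation
    return np.mean((s - t) ** 2) / 2.0
```

Because the loss depends only on standardized features, scaling or shifting the teacher's feature magnitudes leaves it unchanged, which is exactly the relaxation the abstract motivates for heterogeneous teacher-student pairs.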
Keywords | Knowledge Distillation; Model Compression; Object Detection
Discipline | Engineering::Control Science and Engineering
Indexed By | EI
Language | English
Representative Paper | Yes
Sub-direction Classification | Object Detection, Tracking and Recognition
State Key Laboratory Planned Research Direction | Visual Information Processing
Associated Dataset to Be Deposited | No
Document Type | Conference Paper
Identifier | http://ir.ia.ac.cn/handle/173211/52086
Collection | Laboratory of Cognition and Decision for Complex Systems_Efficient Intelligent Computing and Learning
Corresponding Author | Yifan Zhang
Affiliations | 1. NLPR & AIRIA, Institute of Automation, Chinese Academy of Sciences; 2. Shanghai AI Laboratory; 3. School of Artificial Intelligence, University of Chinese Academy of Sciences
First Author's Affiliation | National Laboratory of Pattern Recognition
Corresponding Author's Affiliation | National Laboratory of Pattern Recognition
Recommended Citation (GB/T 7714) | Cao W, Zhang Y, Gao J, et al. PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient[C]//Advances in Neural Information Processing Systems 35 (NeurIPS 2022). 2022: 15394-15406.
Files in This Item
File Name/Size | Document Type | Open Type | License
NeurIPS-2022-pkd-gen(2614KB) | Conference Paper | Open Access | CC BY-NC-SA