CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning
Li KC(李焜炽); Wan J(万军); Yu S(余山)
Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING
Year: 2022
Volume: 31, Pages: 3825–3837
Abstract

Recently, owing to their superior performance, knowledge distillation-based (KD-based) methods with exemplar rehearsal have been widely applied in class incremental learning (CIL). However, we discover that they suffer from a feature uncalibration problem, which is caused by directly transferring knowledge from the old model to the new model when learning a new task. Because the old model confuses the feature representations of the learned and new classes, the KD loss and the classification loss used in KD-based methods are heterogeneous, so learning the existing knowledge directly from the old model, as typical KD-based methods do, is detrimental. To tackle this problem, we propose a feature calibration network (FCN), which calibrates the existing knowledge to alleviate the feature-representation confusion of the old model. In addition, to relieve the task-recency bias of FCN caused by the limited storage memory in CIL, we propose a novel image-feature hybrid sample rehearsal strategy that trains FCN by splitting the memory budget to store both image and feature exemplars of previous tasks. Since feature embeddings have a much lower dimensionality than images, this allows more samples to be stored for training FCN. Based on these two improvements, we propose the Cascaded Knowledge Distillation Framework (CKDF), which consists of three main stages. In the first stage, FCN is trained to calibrate the existing knowledge of the old model. Then, the new model is trained by simultaneously transferring knowledge from the calibrated teacher model through knowledge distillation and learning the new classes. Finally, after the new task has been learned, the feature exemplars of previous tasks are updated. Importantly, we demonstrate that CKDF is a general framework that can be applied to various KD-based methods. Experimental results show that our method achieves state-of-the-art performance on several CIL benchmarks.
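
To make the three-stage procedure concrete, below is a minimal PyTorch-style sketch of one incremental step as described in the abstract. The interfaces (an fcn head mapping stored features to calibrated old-class logits, a features() method on the backbones, the loss weighting, and the temperature T) are illustrative assumptions, not the authors' released implementation. The sketch also reflects the memory-split intuition: a 32x32x3 image occupies 3072 values while, say, a 512-dimensional embedding occupies 512, so several feature exemplars fit in the budget of one image exemplar (these dimensions are only illustrative).

import torch
import torch.nn.functional as F

def ckdf_incremental_step(old_model, new_model, fcn, task_loader,
                          image_memory, feature_memory,
                          opt_fcn, opt_new, epochs_fcn=10, epochs_new=100, T=2.0):
    """One incremental step following the three stages described in the abstract.
    All hyperparameters and module interfaces are illustrative assumptions."""
    old_model.eval()

    # Stage 1: train FCN to calibrate the old model's knowledge, using the
    # low-dimensional feature exemplars stored for previous tasks.
    for _ in range(epochs_fcn):
        for feats, labels in feature_memory:
            logits = fcn(feats)                      # calibrated logits over old classes
            loss = F.cross_entropy(logits, labels)
            opt_fcn.zero_grad()
            loss.backward()
            opt_fcn.step()

    # Stage 2: train the new model by jointly learning the new classes and
    # distilling from the calibrated teacher (old backbone followed by FCN).
    fcn.eval()
    for _ in range(epochs_new):
        for images, labels in task_loader:           # new-task data plus image exemplars
            logits_new = new_model(images)
            with torch.no_grad():
                calibrated = fcn(old_model.features(images))
            n_old = calibrated.size(1)
            kd_loss = F.kl_div(F.log_softmax(logits_new[:, :n_old] / T, dim=1),
                               F.softmax(calibrated / T, dim=1),
                               reduction="batchmean") * (T * T)
            ce_loss = F.cross_entropy(logits_new, labels)
            loss = ce_loss + kd_loss                 # equal weighting is an assumption
            opt_new.zero_grad()
            loss.backward()
            opt_new.step()

    # Stage 3: after the new task is learned, refresh the stored feature exemplars
    # of previous tasks (re-extracting with the updated backbone is an assumption).
    updated_feature_memory = []
    with torch.no_grad():
        for images, labels in image_memory:
            updated_feature_memory.append((new_model.features(images), labels))
    return updated_feature_memory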

Indexed By: SCI
Language: English
Sub-direction Classification: Machine Learning
State Key Laboratory Planned Research Direction: Cognitive Mechanisms and Brain-Inspired Learning
Associated Dataset to Be Deposited:
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/56635
Collection: Laboratory of Brain Atlas and Brain-Inspired Intelligence_Brainnetome Research
Corresponding Author: Yu S (余山)
Affiliations: 1. Institute of Automation, Chinese Academy of Sciences
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended Citation:
GB/T 7714
Li KC, Wan J, Yu S. CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, 31: 3825–3837.
APA: Li KC, Wan J, & Yu S. (2022). CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning. IEEE TRANSACTIONS ON IMAGE PROCESSING, 31, 3825–3837.
MLA: Li KC, et al. "CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning". IEEE TRANSACTIONS ON IMAGE PROCESSING 31 (2022): 3825–3837.
Files in This Item:
File Name: CKDF_Cascaded_Knowledge_Distillation_Framework_for_Robust_Incremental_Learning.pdf (3813KB)
Format: Adobe PDF
Document Type: Journal Article
Version: Author's Accepted Manuscript
Access: Open Access
License: CC BY-NC-SA
 

Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.