Knowledge Commons of Institute of Automation, CAS
Squeezing More Past Knowledge for Online Class-Incremental Continual Learning
Da Yu; Mingyi Zhang; et al.
Source Publication | IEEE/CAA Journal of Automatica Sinica
ISSN | 2329-9266 |
Publication Year | 2023
Volume | 10
Issue | 3
Pages | 722-736
Abstract | Continual learning (CL) studies the problem of accumulating knowledge over time from a stream of data. A crucial challenge is that neural networks suffer from performance degradation on previously seen data, known as catastrophic forgetting, because parameters are shared across all tasks. In this work, we consider a more practical online class-incremental CL setting, where the model learns new samples in an online manner and may continuously encounter new classes. Moreover, prior knowledge is unavailable during training and evaluation. Existing works usually exploit stored samples along a single dimension, ignoring much valuable supervisory information. To better tackle this setting, we propose a novel replay-based CL method that leverages the multi-level representations produced while training samples are processed, and strengthens supervision to consolidate previous knowledge. Specifically, besides the raw past samples, we store the corresponding logits and features in memory. Furthermore, to imitate the predictions of the past model, we construct extra constraints from the multi-level information stored in memory. With the same number of samples for replay, our method can therefore use more past knowledge to prevent interference. We conduct extensive evaluations on several popular CL datasets, and experiments show that our method consistently outperforms state-of-the-art methods across various sizes of episodic memory. We further provide a detailed analysis of these results and demonstrate that our method is more viable in practical scenarios.
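The abstract describes the method at a high level: alongside each replayed raw sample, the logits and features recorded when that sample was first seen are stored and used as extra supervision during replay. Below is a minimal PyTorch-style sketch of that idea, not the paper's exact formulation; the `backbone`/`head` split, the reservoir buffer, the MSE-based logit/feature constraints, and the weights `alpha`/`beta` are all illustrative assumptions.

```python
import random
import torch
import torch.nn.functional as F

class MultiLevelBuffer:
    """Reservoir-sampled episodic memory storing, for each sample, the raw
    input and label plus the logits and features the model produced when
    the sample was first seen (the multi-level information)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []   # list of (x, y, logits, feats) tuples
        self.seen = 0

    def add(self, xs, ys, logits, feats):
        for item in zip(xs, ys, logits, feats):
            self.seen += 1
            if len(self.items) < self.capacity:
                self.items.append(item)
            else:
                # Reservoir sampling keeps a uniform subset of the stream.
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.items[j] = item

    def sample(self, batch_size):
        batch = random.sample(self.items, min(batch_size, len(self.items)))
        xs, ys, logits, feats = map(torch.stack, zip(*batch))
        return xs, ys, logits, feats

def train_step(model, optimizer, buffer, x, y, alpha=1.0, beta=1.0):
    """One online step: learn the incoming batch, replay stored samples,
    and constrain current logits/features to match the stored ones."""
    feats = model.backbone(x)            # assumed backbone/head split
    logits = model.head(feats)
    loss = F.cross_entropy(logits, y)

    if buffer.items:
        bx, by, old_logits, old_feats = (t.to(x.device)
                                         for t in buffer.sample(x.size(0)))
        replay_feats = model.backbone(bx)
        replay_logits = model.head(replay_feats)
        loss = loss + F.cross_entropy(replay_logits, by)
        # Extra supervision from the stored multi-level information:
        loss = loss + alpha * F.mse_loss(replay_logits, old_logits)
        loss = loss + beta * F.mse_loss(replay_feats, old_feats)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Store raw samples together with the logits/features just computed,
    # so future replays can imitate the current (soon-to-be-past) model.
    buffer.add(x.detach().cpu(), y.cpu(),
               logits.detach().cpu(), feats.detach().cpu())
    return loss.item()
```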
Keyword |
DOI | 10.1109/JAS.2023.123090 |
Document Type | Journal article
Identifier | http://ir.ia.ac.cn/handle/173211/51184 |
Collection | Academic Journals_IEEE/CAA Journal of Automatica Sinica
Recommended Citation GB/T 7714 | Da Yu, Mingyi Zhang, Mantian Li, et al. Squeezing More Past Knowledge for Online Class-Incremental Continual Learning[J]. IEEE/CAA Journal of Automatica Sinica, 2023, 10(3): 722-736.
APA | Da Yu, Mingyi Zhang, Mantian Li, Fusheng Zha, Junge Zhang, ... & Kaiqi Huang. (2023). Squeezing More Past Knowledge for Online Class-Incremental Continual Learning. IEEE/CAA Journal of Automatica Sinica, 10(3), 722-736.
MLA | Da Yu, et al. "Squeezing More Past Knowledge for Online Class-Incremental Continual Learning." IEEE/CAA Journal of Automatica Sinica 10.3 (2023): 722-736.
Files in This Item:
File Name/Size | DocType | Version | Access | License
JAS-2022-1124.pdf (7599KB) | Journal article | Published version | Open access | CC BY-NC-SA
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.