Squeezing More Past Knowledge for Online Class-Incremental Continual Learning
Da Yu; Mingyi Zhang; Mantian Li; Fusheng Zha; Junge Zhang; Lining Sun; Kaiqi Huang
Source Publication: IEEE/CAA Journal of Automatica Sinica
Abstract: Continual learning (CL) studies the problem of accumulating knowledge over time from a stream of data. A crucial challenge is that neural networks suffer performance degradation on previously seen data, known as catastrophic forgetting, because parameters are shared across tasks and are overwritten by new ones. In this work, we consider a more practical online class-incremental CL setting, where the model learns new samples in an online manner and may continuously encounter new classes. Moreover, task identities are unavailable during training and evaluation. Existing works usually exploit stored samples along a single dimension, ignoring much valuable supervisory information. To better tackle this setting, we propose a novel replay-based CL method that leverages the multi-level representations produced while training samples for replay, strengthening supervision to consolidate previous knowledge. Specifically, besides the raw samples themselves, we store the corresponding logits and features in the memory. Furthermore, to imitate the predictions of the past model, we construct extra constraints from the multi-level information stored in the memory. With the same number of samples for replay, our method can thus use more past knowledge to prevent interference. We conduct extensive evaluations on several popular CL datasets, and experiments show that our method consistently outperforms state-of-the-art methods across various sizes of episodic memory. We further provide a detailed analysis of these results and demonstrate that our method is more viable in practical scenarios.
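The core idea in the abstract — an episodic memory that stores not only raw past samples but also the logits and intermediate features they produced, so that replay can impose richer constraints — can be sketched as follows. This is a hedged illustration, not the authors' implementation: the class name, the (x, y, logits, features) tuple layout, and the reservoir-sampling insertion policy are all assumptions for the sake of a minimal, self-contained example.

```python
import random


class MultiLevelReplayBuffer:
    """Illustrative episodic memory storing, for each past sample, the raw
    input x, its label y, and the logits and intermediate features recorded
    when it was first trained on (a sketch of the multi-level replay idea;
    the reservoir policy and field layout are assumptions)."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []      # list of (x, y, logits, features) tuples
        self.n_seen = 0       # number of stream samples observed so far
        self.rng = random.Random(seed)

    def add(self, x, y, logits, features):
        """Insert via reservoir sampling, keeping a uniform subsample
        of the online stream within the fixed memory budget."""
        self.n_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append((x, y, logits, features))
        else:
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.buffer[j] = (x, y, logits, features)

    def sample(self, batch_size):
        """Draw a replay batch; each entry carries all three levels of
        supervision (label, stored logits, stored features)."""
        k = min(batch_size, len(self.buffer))
        return self.rng.sample(self.buffer, k)
```

At replay time, the stored logits and features would typically serve as targets for distillation-style losses alongside the usual cross-entropy on the stored labels, so each replayed sample supervises the model at several levels at once.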
Keywords: catastrophic forgetting; class-incremental learning; continual learning (CL); experience replay
Document Type: Journal article
Collection: Academic Journals_IEEE/CAA Journal of Automatica Sinica
Recommended Citation
GB/T 7714
Da Yu,Mingyi Zhang,Mantian Li,et al. Squeezing More Past Knowledge for Online Class-Incremental Continual Learning[J]. IEEE/CAA Journal of Automatica Sinica,2023,10(3):722-736.
APA: Da Yu, Mingyi Zhang, Mantian Li, Fusheng Zha, Junge Zhang, Lining Sun, & Kaiqi Huang. (2023). Squeezing More Past Knowledge for Online Class-Incremental Continual Learning. IEEE/CAA Journal of Automatica Sinica, 10(3), 722-736.
MLA: Da Yu, et al. "Squeezing More Past Knowledge for Online Class-Incremental Continual Learning." IEEE/CAA Journal of Automatica Sinica 10.3 (2023): 722-736.
Files in This Item:
File: JAS-2022-1124.pdf (7599 KB); DocType: Journal article, published version; Access: Open access; License: CC BY-NC-SA
Related Services
Google Scholar
Similar articles in Google Scholar
[Da Yu]'s Articles
[Mingyi Zhang]'s Articles
[Mantian Li]'s Articles
Baidu academic
Similar articles in Baidu academic
[Da Yu]'s Articles
[Mingyi Zhang]'s Articles
[Mantian Li]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Da Yu]'s Articles
[Mingyi Zhang]'s Articles
[Mantian Li]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.