Knowledge Commons of Institute of Automation, CAS
Incremental Concept Learning via Online Generative Memory Recall
Authors | Li, Huaiyu (1,2); Dong, Weiming; Hu, Bao-Gang
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN | 2162-237X
Publication Date | 2021-07-01
Volume | 32
Issue | 7
Pages | 3206-3216
Corresponding Author | Dong, Weiming (weiming.dong@ia.ac.cn)
Abstract | The ability to learn more concepts from incrementally arriving data over time is essential for the development of a lifelong learning system. However, deep neural networks often suffer from forgetting previously learned concepts when continually learning new ones, which is known as the catastrophic forgetting problem. The main reason for catastrophic forgetting is that past concept data are not available, and neural weights are changed during incremental learning of new concepts. In this article, we propose an incremental concept learning framework that includes two components, namely, ICLNet and RecallNet. ICLNet, which consists of a trainable feature extractor and a dynamic concept memory matrix, aims to learn new concepts incrementally. We propose a concept-contrastive loss to reduce the magnitude of neural weight changes and mitigate the catastrophic forgetting problem. RecallNet aims to consolidate the memory of old concepts and recall pseudo samples while ICLNet learns new concepts. We propose a balanced online memory recall strategy to reduce the information loss of old concept memory. We evaluate the proposed approach on the MNIST, Fashion-MNIST, and SVHN data sets and compare it with other pseudorehearsal-based approaches. Extensive experiments demonstrate the effectiveness of our approach.
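The abstract describes a balanced online memory recall strategy in which RecallNet supplies pseudo samples of old concepts while ICLNet trains on new ones. The paper's exact procedure is not reproduced in this record; the following is a minimal sketch of the general balanced pseudo-rehearsal idea, where `balanced_rehearsal_batch`, `recall`, and `toy_recall` are hypothetical names standing in for the generative recall step.

```python
import random

def balanced_rehearsal_batch(new_batch, old_concepts, recall, per_concept):
    """Mix a batch of new-concept samples with pseudo samples recalled
    evenly from every old concept, so no old concept is under-represented
    during incremental training (the core idea behind balanced rehearsal)."""
    batch = list(new_batch)
    for concept in old_concepts:
        # `recall(concept, n)` stands in for the generative memory recall:
        # it returns n pseudo samples labeled with `concept`.
        batch.extend(recall(concept, per_concept))
    random.shuffle(batch)
    return batch

# Toy stand-in for generative recall: emits (label, index) pairs.
def toy_recall(concept, n):
    return [(concept, i) for i in range(n)]

new_data = [("C", i) for i in range(4)]
batch = balanced_rehearsal_batch(new_data, ["A", "B"], toy_recall, per_concept=2)
# 4 new samples + 2 pseudo samples per old concept = 8 samples total
```

In an actual continual-learning loop, `recall` would invoke a generative model (such as the GAN-style RecallNet the keywords suggest) rather than a toy list builder, and the mixed batch would be fed to the classifier's training step.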
Keywords | Task analysis; Learning systems; Neural networks; Feature extraction; Visualization; Knowledge engineering; Training; Catastrophic forgetting; continual learning; generative adversarial networks (GANs)
DOI | 10.1109/TNNLS.2020.3010581
Indexed By | SCI
Language | English
Funding Projects | National Key Research and Development Program of China [2018AAA0101005]; National Natural Science Foundation of China [61832016]; National Natural Science Foundation of China [61672520]; National Natural Science Foundation of China [61720106006]
Funders | National Key Research and Development Program of China; National Natural Science Foundation of China
WOS Research Areas | Computer Science; Engineering
WOS Categories | Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS Accession Number | WOS:000670541500033
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Sub-direction Classification | Machine Learning
Document Type | Journal Article
Identifier | http://ir.ia.ac.cn/handle/173211/45281
Collection | State Key Laboratory of Multimodal Artificial Intelligence Systems, Multimedia Computing
Author Affiliations | 1. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
First Author Affiliation | National Laboratory of Pattern Recognition
Corresponding Author Affiliation | National Laboratory of Pattern Recognition
Recommended Citation (GB/T 7714) | Li, Huaiyu, Dong, Weiming, Hu, Bao-Gang. Incremental Concept Learning via Online Generative Memory Recall[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32(7): 3206-3216.
APA | Li, Huaiyu, Dong, Weiming, & Hu, Bao-Gang. (2021). Incremental Concept Learning via Online Generative Memory Recall. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 32(7), 3206-3216.
MLA | Li, Huaiyu, et al. "Incremental Concept Learning via Online Generative Memory Recall." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 32.7 (2021): 3206-3216.
Files in This Item | No files are associated with this item.
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.