Incremental Concept Learning via Online Generative Memory Recall
Li, Huaiyu (1,2); Dong, Weiming (1); Hu, Bao-Gang (1)
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Publication Date: 2021-07-01
Volume: 32, Issue: 7, Pages: 3206-3216
Corresponding Author: Dong, Weiming (weiming.dong@ia.ac.cn)
Abstract: The ability to learn more concepts from incrementally arriving data over time is essential for the development of a lifelong learning system. However, deep neural networks often suffer from forgetting previously learned concepts when continually learning new ones, which is known as the catastrophic forgetting problem. The main reason for catastrophic forgetting is that past concept data are not available and neural weights change while new concepts are learned incrementally. In this article, we propose an incremental concept learning framework that includes two components, namely, ICLNet and RecallNet. ICLNet, which consists of a trainable feature extractor and a dynamic concept memory matrix, aims to learn new concepts incrementally. We propose a concept-contrastive loss to reduce the magnitude of neural weight changes and mitigate the catastrophic forgetting problem. RecallNet aims to consolidate old concept memory and recall pseudo samples while ICLNet learns new concepts. We propose a balanced online memory recall strategy to reduce the information loss of old concept memory. We evaluate the proposed approach on the MNIST, Fashion-MNIST, and SVHN data sets and compare it with other pseudorehearsal-based approaches. Extensive experiments demonstrate the effectiveness of our approach.
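The abstract describes a pseudorehearsal (generative replay) setup: a generative memory recalls pseudo samples of previously learned concepts while the learner trains on new ones. The sketch below illustrates only that general training pattern; the class names (Learner, GenerativeMemory), network shapes, the labeling-by-old-model step, and the replay_ratio parameter are assumptions for illustration and do not reproduce the paper's ICLNet/RecallNet architecture, concept memory matrix, concept-contrastive loss, or balanced online memory recall strategy.

```python
# Minimal sketch of generative pseudo-rehearsal: recall pseudo samples of old
# concepts while learning new ones. All module names, network shapes, and
# hyperparameters are illustrative assumptions, NOT the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Learner(nn.Module):
    """Stand-in for an incremental concept learner (feature extractor + classifier head)."""

    def __init__(self, in_dim=784, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, num_classes))

    def forward(self, x):
        return self.net(x)


class GenerativeMemory(nn.Module):
    """Stand-in for a generative memory that recalls pseudo samples of old concepts."""

    def __init__(self, z_dim=64, out_dim=784):
        super().__init__()
        self.z_dim = z_dim
        self.net = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, out_dim), nn.Tanh())

    def sample(self, n):
        return self.net(torch.randn(n, self.z_dim))


def train_step(learner, old_learner, memory, opt, new_x, new_y, replay_ratio=1.0):
    """One incremental step: mix real new-concept data with recalled pseudo samples.

    Pseudo samples are labeled by a frozen snapshot of the previous learner, a common
    generative-replay choice (an assumption here, not the paper's recall strategy).
    """
    learner.train()
    n_replay = int(replay_ratio * new_x.size(0))
    with torch.no_grad():
        pseudo_x = memory.sample(n_replay)              # recall old-concept memory
        pseudo_y = old_learner(pseudo_x).argmax(dim=1)  # label with the old model
    x = torch.cat([new_x, pseudo_x], dim=0)
    y = torch.cat([new_y, pseudo_y], dim=0)
    loss = F.cross_entropy(learner(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()


if __name__ == "__main__":
    learner, old_learner, memory = Learner(), Learner(), GenerativeMemory()
    old_learner.load_state_dict(learner.state_dict())   # frozen copy of the previous-step learner
    old_learner.eval()
    opt = torch.optim.Adam(learner.parameters(), lr=1e-3)
    new_x, new_y = torch.rand(32, 784), torch.randint(0, 10, (32,))  # toy "new concept" batch
    print(train_step(learner, old_learner, memory, opt, new_x, new_y))
```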
Keywords: Task analysis; Learning systems; Neural networks; Feature extraction; Visualization; Knowledge engineering; Training; Catastrophic forgetting; Continual learning; Generative adversarial networks (GANs)
DOI: 10.1109/TNNLS.2020.3010581
Indexed By: SCI
Language: English
Funding Project: National Key Research and Development Program of China [2018AAA0101005]; National Natural Science Foundation of China [61832016]; National Natural Science Foundation of China [61672520]; National Natural Science Foundation of China [61720106006]
Funding Organization: National Key Research and Development Program of China; National Natural Science Foundation of China
WOS Research Area: Computer Science; Engineering
WOS Category: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS Accession Number: WOS:000670541500033
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Sub-direction Classification: Machine Learning
Citation Statistics
Times Cited (WOS): 13
Document Type: Journal Article
Item Identifier: http://ir.ia.ac.cn/handle/173211/45281
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Multimedia Computing
作者单位1.Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
2.Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
第一作者单位模式识别国家重点实验室
通讯作者单位模式识别国家重点实验室
Recommended Citation:
GB/T 7714
Li, Huaiyu, Dong, Weiming, Hu, Bao-Gang. Incremental Concept Learning via Online Generative Memory Recall[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32(7): 3206-3216.
APA: Li, Huaiyu, Dong, Weiming, & Hu, Bao-Gang. (2021). Incremental Concept Learning via Online Generative Memory Recall. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 32(7), 3206-3216.
MLA: Li, Huaiyu, et al. "Incremental Concept Learning via Online Generative Memory Recall." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 32.7 (2021): 3206-3216.
Files in This Item:
There are no files associated with this item.