CASIA OpenIR > Laboratory of Brain Atlas and Brain-Inspired Intelligence > Brainnetome Research
Continual learning of context-dependent processing in neural networks
Zeng GX (曾冠雄); Chen Y (陈阳); Cui B (崔波); Yu S (余山)
Journal: Nature Machine Intelligence
Date: 2019-08
Issue: 1, Pages: 364–372
Abstract

Deep neural networks are powerful tools in learning sophisticated but fixed mapping rules between inputs and outputs, thereby limiting their application in more complex and dynamic situations in which the mapping rules are not kept the same but change according to different contexts. To lift such limits, we developed an approach involving a learning algorithm, called orthogonal weights modification, with the addition of a context-dependent processing module. We demonstrated that with orthogonal weights modification to overcome catastrophic forgetting, and the context-dependent processing module to learn how to reuse a feature representation and a classifier for different contexts, a single network could acquire numerous context-dependent mapping rules in an online and continual manner, with as few as approximately ten samples to learn each. Our approach should enable highly compact systems to gradually learn myriad regularities of the real world and eventually behave appropriately within it.
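The core mechanism described above, orthogonal weights modification (OWM), projects each new weight update into the subspace orthogonal to the inputs of previously learned tasks, so new learning cannot disturb old input–output mappings. Below is a minimal NumPy sketch of that idea, assuming the recursive RLS-style projector update; the variable names, dimensions, and learning rate are illustrative, not taken from the paper's implementation.

```python
import numpy as np

def update_projector(P, x, alpha=1e-4):
    """Shrink the projector P so directions spanned by past inputs x
    are (approximately) removed from future weight updates:
    P <- P - P x x^T P / (alpha + x^T P x)."""
    x = x.reshape(-1, 1)
    Px = P @ x
    k = Px / (alpha + x.T @ Px)  # RLS-style gain vector
    return P - k @ Px.T

rng = np.random.default_rng(0)
n_in, n_out = 20, 5

# Inputs from an "old" task the network has already learned on.
X_old = rng.standard_normal((6, n_in))

P = np.eye(n_in)
for x in X_old:
    P = update_projector(P, x)

# An arbitrary new-task gradient step, projected through P before use:
# W <- W - lr * dW @ P, so the change to old outputs W @ x_old is ~0.
W = rng.standard_normal((n_out, n_in))
dW = rng.standard_normal((n_out, n_in)) * 0.1
y_old_before = W @ X_old.T
W_new = W - 0.5 * dW @ P
y_old_after = W_new @ X_old.T

print(np.abs(y_old_after - y_old_before).max())  # near zero: old mapping preserved
```

Because P is positive semidefinite with P ⪯ I, each update drives ‖P x‖ for stored inputs below √α, so projected updates leave responses to old inputs essentially unchanged while remaining free to learn in the orthogonal subspace.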

Keywords: Continual Learning, Context-dependent Learning
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/26152
Collection: Laboratory of Brain Atlas and Brain-Inspired Intelligence_Brainnetome Research
Corresponding Author: Yu S (余山)
Recommended Citation:
GB/T 7714
Zeng GX, Chen Y, Cui B, et al. Continual learning of context-dependent processing in neural networks[J]. Nature Machine Intelligence, 2019(1): 364–372.
APA: Zeng GX, Chen Y, Cui B, & Yu S. (2019). Continual learning of context-dependent processing in neural networks. Nature Machine Intelligence(1), 364–372.
MLA: Zeng GX, et al. "Continual learning of context-dependent processing in neural networks". Nature Machine Intelligence 1 (2019): 364–372.
Files in This Item:
There are no files associated with this item.
Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.