Knowledge Commons of Institute of Automation, CAS
Temporal Knowledge Sharing enable Spiking Neural Network Learning from Past and Future
Dong, Yiting; Zhao, Dongcheng; Zeng, Yi
Journal | IEEE Transactions on Artificial Intelligence |
Year | 2024 |
Pages | 1-10 |
Abstract | Spiking Neural Networks (SNNs) have attracted significant attention from researchers across various domains due to their brain-inspired information processing mechanism. However, SNNs typically grapple with challenges such as extended time steps, low temporal information utilization, and the requirement for a consistent time step between training and testing. These challenges leave SNNs with high latency. Moreover, the constraint on time steps necessitates retraining the model for new deployments, reducing adaptability. To address these issues, this paper proposes a novel perspective that views the SNN as a temporal aggregation model. We introduce the Temporal Knowledge Sharing (TKS) method, which facilitates information interaction between different time points and can be perceived as a form of temporal self-distillation. To validate the efficacy of TKS in information processing, we tested it on static datasets such as CIFAR10, CIFAR100, and ImageNet-1k, as well as neuromorphic datasets such as DVS-CIFAR10 and NCALTECH101. Experimental results demonstrate that our method achieves state-of-the-art performance compared with other algorithms. Furthermore, TKS addresses the temporal consistency challenge, endowing the model with superior temporal generalization: the network can be trained with longer time steps and maintain high performance when tested with shorter time steps. This considerably accelerates the deployment of SNNs on edge devices. Finally, we conducted ablation experiments and tested TKS on fine-grained tasks; the results showcase TKS's enhanced capability to process information efficiently. |
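The abstract describes TKS as a form of temporal self-distillation, where the SNN's temporally aggregated output guides the predictions of individual time steps. A minimal NumPy sketch of that general idea (the function name, the mean-over-time "teacher", and the weighting `alpha` are illustrative assumptions, not the paper's exact formulation):

```python
# Illustrative sketch of temporal self-distillation for an SNN classifier.
# NOT the authors' exact TKS method: the mean-over-time teacher and the
# loss weighting `alpha` are assumptions made for this example.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def temporal_self_distillation_loss(step_logits, labels, alpha=0.5):
    """step_logits: (T, B, C) per-time-step outputs of an SNN.
    The temporally aggregated prediction acts as a teacher: each
    individual time step is pulled toward it (KL term), alongside
    the usual cross-entropy on the aggregated prediction."""
    T, B, _ = step_logits.shape
    agg = step_logits.mean(axis=0)           # temporal aggregation over T
    p_agg = softmax(agg)                     # teacher distribution
    # Cross-entropy of the aggregated prediction against the labels.
    ce = -np.log(p_agg[np.arange(B), labels] + 1e-12).mean()
    # KL(teacher || step t), averaged over time steps and batch.
    kl = 0.0
    for t in range(T):
        p_t = softmax(step_logits[t])
        kl += (p_agg * (np.log(p_agg + 1e-12)
                        - np.log(p_t + 1e-12))).sum(axis=1).mean()
    kl /= T
    return ce + alpha * kl
```

With identical logits at every time step the KL term vanishes and only the aggregated cross-entropy remains, which is what lets a network trained this way tolerate fewer time steps at test time.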
Discipline | Engineering::Control Science and Engineering |
DOI | 10.1109/TAI.2024.3374268 |
Indexed By | Other |
Language | English |
Sub-direction (of Seven Major Directions) | Brain-inspired Models and Computing |
State Key Laboratory Planned Direction | Cognitive Mechanisms and Brain-inspired Learning |
Associated Dataset Requiring Deposit | No |
Document Type | Journal article |
Item Identifier | http://ir.ia.ac.cn/handle/173211/57258 |
Collection | Laboratory of Brain Atlas and Brain-inspired Intelligence — Brain-inspired Cognitive Computing |
Corresponding Author | Zeng, Yi |
Recommended Citation (GB/T 7714) | Dong, Yiting, Zhao, Dongcheng, Zeng, Yi. Temporal Knowledge Sharing enable Spiking Neural Network Learning from Past and Future[J]. IEEE Transactions on Artificial Intelligence, 2024: 1-10. |
APA | Dong, Yiting, Zhao, Dongcheng, & Zeng, Yi. (2024). Temporal Knowledge Sharing enable Spiking Neural Network Learning from Past and Future. IEEE Transactions on Artificial Intelligence, 1-10. |
MLA | Dong, Yiting, et al. "Temporal Knowledge Sharing enable Spiking Neural Network Learning from Past and Future". IEEE Transactions on Artificial Intelligence (2024): 1-10. |
Files in This Item |
File Name/Size | Document Type | Version | Access | License |
TAI_TKS.pdf (8822KB) | Journal article | Author's accepted manuscript | Open access | CC BY-NC-SA |
Unless otherwise specified, all content in this system is protected by copyright, with all rights reserved.