Abstract
Spiking neural networks suffer from catastrophic forgetting during continual learning. Existing unsupervised methods require a sufficiently large network to learn well, which consumes substantial computing time and resources; this study aims to achieve good continual learning performance with a small network. Inspired by neuroscience, a method combining network splitting with existing adaptive synaptic plasticity is proposed: the network is first divided into non-overlapping subnetworks, one per task, and each subnetwork then learns unsupervised through adaptive synaptic plasticity. The method is easy to implement, requires little computational overhead, and allows small-scale spiking neural networks to maintain high performance across many sequentially presented tasks. The results show that on small-scale spiking neural networks, test accuracy is significantly higher with network splitting than without it; across the four datasets the average improvement is about 36%.
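The abstract outlines the method at a high level: partition the network into as many disjoint subnetworks as there are tasks, then let each subnetwork learn its task unsupervised via adaptive synaptic plasticity. The record gives no code, so the following is a minimal Python sketch of that idea under stated assumptions: the partitioning scheme, the layer sizes, and the soft-bounded Hebbian update standing in for the paper's adaptive synaptic plasticity rule are all illustrative, not the authors' implementation.

```python
import numpy as np

# Minimal sketch (not the paper's code): split the plastic layer of a
# small spiking network into one disjoint subnetwork per task, then
# train each subnetwork alone with an adaptive (Hebbian-style) rule.

rng = np.random.default_rng(0)

N_INPUT = 784    # e.g. flattened 28x28 images, rate/Poisson encoded (assumed)
N_NEURONS = 100  # deliberately small network, as in the paper's setting
N_TASKS = 4      # one disjoint subnetwork per task

# Disjoint, equal-sized neuron partitions: subnetwork k owns neurons[k].
neurons = np.array_split(rng.permutation(N_NEURONS), N_TASKS)

# Input -> plastic-layer weights, shared array but column-partitioned.
W = rng.uniform(0.0, 0.3, size=(N_INPUT, N_NEURONS))

def train_task(task_id, spike_batches, lr=0.01, w_max=1.0):
    """Unsupervised update restricted to the task's own subnetwork.

    spike_batches: iterable of (pre, post) 0/1 spike-count vectors of
    shape (N_INPUT,) and (N_NEURONS,) accumulated over a stimulus window.
    """
    own = neurons[task_id]  # only these columns are plastic for this task
    for pre, post in spike_batches:
        # Hebbian potentiation with a soft upper bound, standing in for
        # the adaptive synaptic plasticity rule used in the paper:
        dw = lr * np.outer(pre, post) * (w_max - W)
        W[:, own] += dw[:, own]  # all other subnetworks stay frozen

# Sequential (continual) training: tasks arrive one after another, but
# each task only ever modifies its own subnetwork's weights.
for t in range(N_TASKS):
    fake_batch = [(rng.integers(0, 2, N_INPUT), rng.integers(0, 2, N_NEURONS))
                  for _ in range(5)]  # placeholder spike data
    train_task(t, fake_batch)
```

Because the partitions are disjoint, training task t can never overwrite weights learned for earlier tasks, which is the mechanism behind the reported resistance to catastrophic forgetting at small network sizes.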
Author
CHEN Huanwen (陈焕文), School of Automation, Central South University, Changsha 410083, China
Source
Journal of Huazhong University of Science and Technology (Natural Science Edition)
Indexed in: EI, CAS, CSCD, Peking University Core Journals
2024, No. 3, pp. 156-160 (5 pages)
Funding
Supported by the National Natural Science Foundation of China (Grant No. 52172169) and the Natural Science Foundation of Hunan Province (Grant No. 2021JJ30863).
Keywords
network split
spiking neural networks
continual learning
catastrophic forgetting
unsupervised