Abstract
Spiking neural networks (SNNs) have attracted wide attention for their high energy efficiency on neuromorphic chips. ANN-SNN conversion is one of the mainstream approaches to building deep SNNs, but at extremely low latency the dead-neuron spike error opens a performance gap between the target SNN and the source ANN. To eliminate this error and obtain high-performance, low-latency SNNs, this paper proposes a partitioned interval-shift activation function that replaces the conventional ReLU activation. Experimental results show that on the CIFAR-10 dataset, the SNN obtained by our method reaches 94.78% Top-1 accuracy with only 4 time steps.
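The exact definition of the partitioned interval-shift activation is not given in this record. The sketch below is a minimal PyTorch illustration, assuming the activation follows the clip-floor-shift style of quantized activations commonly used in ANN-SNN conversion; the class name IntervalShiftActivation, the number of intervals L, the trainable threshold, and the 0.5 shift are all illustrative assumptions, not the paper's formulation.

```python
import torch
import torch.nn as nn


class IntervalShiftActivation(nn.Module):
    """Hypothetical stand-in for the paper's partitioned interval-shift activation.

    Assumption: it behaves like the clip/floor/shift quantized activations often
    used in ANN-SNN conversion, with L quantization intervals, a trainable firing
    threshold, and a 0.5 shift; the paper's exact definition may differ.
    """

    def __init__(self, num_intervals: int = 4, threshold_init: float = 1.0):
        super().__init__()
        self.L = num_intervals
        # Trainable threshold, later copied to the IF neuron's firing threshold.
        self.threshold = nn.Parameter(torch.tensor(threshold_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize by the threshold, quantize to L intervals with a 0.5 shift,
        # clip to [0, 1], and rescale -- approximating the average firing rate
        # of an IF neuron observed over L time steps.
        # (In practice, training would pass gradients through the floor with a
        # straight-through estimator; omitted here for brevity.)
        y = torch.floor(x * self.L / self.threshold + 0.5) / self.L
        y = torch.clamp(y, 0.0, 1.0)
        return y * self.threshold
```

Under these assumptions, each ReLU in the source ANN would be swapped for this module before conversion, and L would be matched to the intended number of inference time steps (e.g., 4), so that the ANN activation mirrors the converted SNN's rate-coded output.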
Author
HUANG Zhipeng (黄志鹏)
College of Computer and Cyber Security, Fujian Normal University, Fuzhou 350117, China
Source
Journal of Fujian Computer (《福建电脑》)
2024, No. 12, pp. 8-13 (6 pages)
Funding
Supported by the Natural Science Foundation of Fujian Province, China (General Program, No. 2022J01656).