
Efficient ANN-SNN Conversion by Interval Shifting
Abstract: Spiking neural networks (SNNs) have received widespread attention for their high energy efficiency on neuromorphic chips. ANN-SNN conversion is one of the mainstream methods for building deep SNNs, but at extremely low latency the dead-neuron spiking error creates a performance gap between the target SNN and the source ANN. To eliminate the dead-neuron spiking error and obtain high-performance, low-latency SNNs, this paper proposes an interval-shift activation function to replace the conventional ReLU activation. Experimental results show that on the CIFAR-10 dataset, the SNN obtained by our method reaches 94.78% Top-1 accuracy in only 4 time steps.
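The abstract does not give the exact form of the proposed activation. In conversion schemes of this family, the ReLU replacement is typically a clipped, quantized function whose output levels match the discrete spike counts an SNN can emit in a few time steps, with a shift inside each quantization interval to center the rounding error. A minimal sketch under that assumption (the function name, parameters, and the half-interval shift of 0.5 are all hypothetical, not taken from the paper):

```python
import math

def interval_shift_activation(x, threshold=1.0, num_steps=4, shift=0.5):
    """Hypothetical sketch of an interval-shift ReLU replacement.

    Splits [0, threshold] into num_steps equal intervals and snaps each
    input to a shifted quantization level, so ANN activations align with
    the spike counts an SNN can produce in num_steps time steps.
    """
    x = min(max(x, 0.0), threshold)        # ReLU-like clip, bounded by threshold
    step = threshold / num_steps           # width of one quantization interval
    # floor with an in-interval shift; cap at the maximum spike count
    return step * min(math.floor(x / step + shift), num_steps)

# Example: with threshold=1.0 and 4 time steps, outputs snap to
# multiples of 0.25, matching 0-4 spikes per neuron.
levels = [interval_shift_activation(v) for v in (-0.2, 0.3, 0.9, 2.0)]
```

Training the source ANN with such a quantized activation, rather than plain ReLU, is what closes the gap to the 4-time-step SNN reported in the abstract.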
Author: HUANG Zhipeng (College of Computer and Cyber Security, Fujian Normal University, Fuzhou 350117, China)
Source: Journal of Fujian Computer, 2024, No. 12, pp. 8-13 (6 pages)
Funding: Supported by the Natural Science Foundation of Fujian Province (No. 2022J01656).
Keywords: Spiking Neural Networks; ANN-SNN Conversion; Interval Shift; ReLU Activation Function