
Research Advances in Liquid State Machine
Abstract: The Liquid State Machine (LSM), characterized by real-time computation and biomimicry, has great potential for processing time-series data. To investigate how to improve the training performance of neural network models and reduce computational complexity, this paper first reviews the relevant research literature of recent years. It then proposes two optimization directions, hardware implementation and software modeling, and summarizes the strengths and weaknesses of the different optimization methods. Hardware- and software-level optimization can improve the learning performance and training speed of neural network models, but problems such as poor controllability and unknown optimal solutions of the algorithms remain. Finally, future research directions are outlined in view of these problems, providing optimization ideas for time-series data processing and pattern recognition.
Authors: ZHANG Yongqiang, NI Shanshan, SONG Meilin, MAN Menghua (Shijiazhuang Campus of Army Engineering University of PLA, Shijiazhuang 050003, China; School of Information Science and Engineering, Hebei University of Science and Technology, Shijiazhuang 050018, China)
Source: Software Engineering, 2023, No. 11, pp. 1-4, 38 (5 pages)
Funding: National Defense Basic Scientific Research Program of the State Administration of Science, Technology and Industry for National Defense (JCKYS2020DC202); Natural Science Foundation of Hebei Province (F2022208002); Key Science and Technology Research Project of Higher Education Institutions of Hebei Province (ZD2021048).
Keywords: spiking neural network; reservoir layer; Liquid State Machine; genetic algorithm
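The abstract describes the LSM architecture only at a high level: a fixed, randomly connected "liquid" of spiking neurons projects the input into a high-dimensional state, and only a simple linear readout is trained. The sketch below illustrates that general pipeline under assumed, illustrative choices (a NumPy-only leaky integrate-and-fire reservoir, sparse random recurrent weights, an exponentially filtered liquid state, and a ridge-regression readout on a delayed-signal task); it is not the model or parameter set studied in the paper.

import numpy as np

rng = np.random.default_rng(0)

# --- Reservoir ("liquid") setup: fixed, untrained random weights (illustrative values) ---
N = 200                      # number of LIF neurons in the reservoir
p_conn = 0.1                 # sparse recurrent connectivity
W_res = (rng.random((N, N)) < p_conn) * rng.normal(0.0, 0.5, (N, N))
np.fill_diagonal(W_res, 0.0)
W_in = rng.normal(0.0, 2.0, N)   # input weights (one-dimensional input for simplicity)

tau_m, tau_s = 20.0, 10.0    # membrane and liquid-state filter time constants (ms)
v_th, dt = 1.0, 1.0          # spike threshold and simulation time step (ms)

def run_reservoir(u):
    """Drive the liquid with input u[t] and return the filtered spike traces."""
    steps = len(u)
    v = np.zeros(N)                      # membrane potentials
    x = np.zeros(N)                      # low-pass-filtered spike trains (liquid state)
    states = np.zeros((steps, N))
    for t in range(steps):
        spikes = (v >= v_th).astype(float)
        v = np.where(spikes > 0, 0.0, v)                   # reset neurons that spiked
        v += dt / tau_m * (-v + W_in * u[t] + W_res @ spikes)
        x += dt / tau_s * (-x + spikes)                    # exponential filtering of spikes
        states[t] = x
    return states

# --- Toy task: predict a delayed version of a random input signal ---
T = 2000
u = rng.uniform(-1.0, 1.0, T)
y = np.roll(u, 10)           # target: the input delayed by 10 time steps

X = run_reservoir(u)

# --- Readout: only this linear layer is trained, here by ridge regression ---
lam = 1e-3
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
y_hat = X @ W_out
print("readout MSE:", np.mean((y[200:] - y_hat[200:]) ** 2))

Because the reservoir weights stay fixed, training reduces to a single linear regression over the recorded liquid states, which is the main source of the low training cost that the abstract attributes to LSMs; the hardware and software optimizations surveyed in the paper target the reservoir and readout around this same basic structure.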
