
储备池计算概述 (Cited by: 21)

Survey on Reservoir Computing
Abstract: To address the difficulty of training traditional recurrent neural networks, a new training method called reservoir computing has been proposed. Its core idea is to train only part of the network's connection weights: the remaining weights are generated randomly and stay unchanged from then on, so that training usually reduces to solving a linear regression problem. More broadly, the reservoir can be used as a temporal kernel function, which greatly widens the range of applications of reservoir computing and makes it more than a mere modification of recurrent neural network training algorithms. Starting from the basic mathematical model of reservoir computing, and taking the perspective of reservoir adaptation, currently one of the most active questions in the field, this paper analyzes the present state of research on reservoir computing, its open problems, and its applications.
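The abstract describes the mechanism only in words; the short sketch below illustrates the standard echo state network recipe it refers to. It is not taken from the paper: the network sizes, the spectral-radius and ridge constants, and the toy sine-prediction task are all illustrative assumptions. The fixed reservoir is updated as x(t) = tanh(W_in u(t) + W x(t-1)), and only the linear readout W_out is fitted, here by ridge regression.

# Minimal echo-state-network sketch: random, fixed input and reservoir
# weights; only the readout is trained by (regularized) linear regression.
# All sizes, constants, and the task are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, washout, ridge = 1, 100, 100, 1e-6

# Fixed random weights; rescale the recurrent matrix so its spectral radius
# is below 1, a common heuristic for the echo state property.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Toy task: one-step-ahead prediction of a sine wave, u(t) -> u(t+1).
u = np.sin(0.2 * np.arange(2000)).reshape(-1, 1)
inputs, targets = u[:-1], u[1:]

# Drive the reservoir and collect its states; these weights are never trained.
x = np.zeros(n_res)
states = []
for u_t in inputs:
    x = np.tanh(W_in @ u_t + W @ x)
    states.append(x)
X = np.array(states)[washout:]      # discard the initial transient
Y = targets[washout:]

# Training the readout reduces to ridge regression on the collected states.
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
pred = X @ W_out
print("training MSE:", float(np.mean((pred - Y) ** 2)))

In a sketch like this the washout length, the spectral-radius scaling, and the ridge parameter are the main quantities one would tune; the reservoir itself is never trained, which is exactly the property the abstract highlights.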
Source: Acta Electronica Sinica (《电子学报》), 2011, No. 10, pp. 2387-2396 (10 pages). Indexed by EI, CAS, CSCD; Peking University Core Journal list.
Funding: Program for New Century Excellent Talents in University, Ministry of Education (No. NCET-10-0062); Specialized Research Fund for the Doctoral Program of Higher Education, Ministry of Education (No. 20092302220013).
Keywords: machine learning; recurrent neural networks; reservoir computing; echo state networks