Weak Convergence of Stochastic Recurrent Neural Networks with Markovian Switching and Time-varying Delay
Abstract: This paper is concerned with the weak convergence of stochastic recurrent neural networks with Markovian switching and time-varying delay. By using a Lyapunov function, stochastic analysis techniques, and a generalized Halanay inequality, sufficient conditions for weak convergence of the model are obtained. For such recurrent neural networks, it is shown that the limit distribution of the transition probability of the segment process associated with the solution process is the unique ergodic invariant probability measure of the model. Moreover, an example and a numerical simulation are provided to demonstrate the effectiveness and applicability of the theoretical results.
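The paper's model is not reproduced on this abstract page. As an illustration only, a stochastic recurrent neural network with Markovian switching and time-varying delay is commonly written in the following form, where x(t) is the state, r(t) is a continuous-time Markov chain (the switching signal), τ(t) is the time-varying delay, and W(t) is a Brownian motion; the notation below is assumed, not quoted from the paper:

```latex
% Illustrative model form only; notation is assumed, not quoted from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{equation*}
  \mathrm{d}x(t) = \Bigl[-D\bigl(r(t)\bigr)x(t) + A\bigl(r(t)\bigr)f\bigl(x(t)\bigr)
    + B\bigl(r(t)\bigr)g\bigl(x(t-\tau(t))\bigr)\Bigr]\,\mathrm{d}t
    + \sigma\bigl(x(t),\,x(t-\tau(t)),\,r(t)\bigr)\,\mathrm{d}W(t),
\end{equation*}
where $D(i)$, $A(i)$, $B(i)$ are mode-dependent coefficient matrices, $f$, $g$ are
activation functions, and $\sigma$ is the diffusion coefficient.
\end{document}
```

A numerical simulation of this kind of model can be sketched with an Euler-Maruyama scheme driven by a simulated two-state Markov chain. The sketch below is a minimal illustration under hypothetical parameter values; it is not the example or simulation reported in the paper.

```python
import numpy as np

# Minimal Euler-Maruyama sketch of a 2-neuron stochastic recurrent network
# with two-state Markovian switching and a constant delay.  All parameter
# values are hypothetical placeholders, not the example used in the paper.

rng = np.random.default_rng(0)
dt, T, tau = 0.001, 10.0, 0.2              # step size, horizon, delay
n_steps, d = int(T / dt), int(tau / dt)    # number of steps, delay in steps

# Mode-dependent coefficient matrices D(i), A(i), B(i) for modes i = 0, 1
D = [np.diag([1.5, 1.2]), np.diag([1.8, 1.6])]
A = [np.array([[0.2, -0.1], [0.1, 0.3]]), np.array([[0.1, 0.2], [-0.2, 0.1]])]
B = [np.array([[0.1, 0.05], [-0.05, 0.1]]), np.array([[0.05, 0.1], [0.1, -0.05]])]
Q = np.array([[-2.0, 2.0], [3.0, -3.0]])   # generator of the switching chain

x = np.tile([0.5, -0.5], (n_steps + d + 1, 1))  # constant initial history
r = 0                                           # initial mode of the chain
f = np.tanh                                     # activation function

for k in range(d, n_steps + d):
    # First-order approximation: leave mode r with probability -q_{rr} * dt
    if rng.random() < -Q[r, r] * dt:
        r = 1 - r
    drift = -D[r] @ x[k] + A[r] @ f(x[k]) + B[r] @ f(x[k - d])
    diffusion = 0.1 * x[k] * np.sqrt(dt) * rng.standard_normal(2)
    x[k + 1] = x[k] + drift * dt + diffusion

print("state at time T:", x[-1])
```

With such a sketch, convergence of the distribution can be inspected empirically, for instance by comparing histograms of the simulated state over many independent runs at well-separated times.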
Source: Mathematical Theory and Applications (《数学理论与应用》), 2015, No. 1, pp. 31-49 (19 pages).
Funding: Supported in part by the National Natural Science Foundation of China (Grants No. 11101054 and 11101434), the Hunan Provincial Natural Science Foundation of China (Grant No. 12JJ4005), the Scientific Research Funds of the Hunan Provincial Science and Technology Department of China (Grant No. 2013FJ4035), the Humanities and Social Sciences Foundation of the Ministry of Education of China (Grant No. 12YJAZH173), the National Innovation Foundation for Undergraduates (No. 201410536011), and the Changsha University of Science and Technology Innovation Foundation for Postgraduates (No. CX2014YB19).
Keywords: Weak convergence; Time-varying delay; Stochastic recurrent neural network; Markovian switching; Brownian motion