Abstract
This paper is the first to use the variational method to study the stability of Hopfield's continuous neural network as the operational-amplifier gain approaches infinity. By examining the variation of the energy function, the stability theorems obtained by A. N. Michel et al. are proved directly, without solving nonlinear differential equations. The results show that, in the infinite-gain limit, Hopfield's continuous and discrete neural network models have very similar stability properties; in particular, the two networks share the same set of stable states.
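The claim about the shared set of stable states rests on the Hopfield energy function: in the infinite-gain limit the continuous network behaves like the discrete one, whose energy cannot increase under asynchronous sign updates and therefore settles into a stable state. The sketch below is an illustration only, not code from the paper; the weight matrix W (symmetric, zero diagonal), bias b, and sign-update rule are the standard discrete Hopfield model, used here to show the energy-descent behaviour numerically.

```python
# Minimal sketch (assumed standard discrete Hopfield model, not the paper's code):
# the energy E(x) = -1/2 x^T W x - b^T x is non-increasing under asynchronous
# sign updates when W is symmetric with zero diagonal, so the state converges
# to a stable state.
import numpy as np

def energy(W, b, x):
    """Hopfield energy E(x) = -1/2 x^T W x - b^T x."""
    return -0.5 * x @ W @ x - b @ x

def run_discrete(W, b, x, max_sweeps=100):
    """Asynchronous updates; stop when no neuron wants to flip (a stable state)."""
    x = x.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(len(x)):
            new_xi = 1.0 if W[i] @ x + b[i] >= 0 else -1.0
            if new_xi != x[i]:
                x[i] = new_xi
                changed = True
        if not changed:
            break
    return x

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)                      # no self-connections
b = rng.standard_normal(n)
x0 = np.where(rng.standard_normal(n) >= 0, 1.0, -1.0)

x_star = run_discrete(W, b, x0)
print("E(x0) =", energy(W, b, x0))
print("E(x*) =", energy(W, b, x_star))        # never larger than E(x0)
```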
Source
Journal of Xidian University (《西安电子科技大学学报》)
1992, No. 3, pp. 28-35 (8 pages)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
Funding
National Natural Science Foundation of China
Keywords
neural network
stability
stable states