Abstract
The discrete Hopfield neural network with delay is studied from the standpoint of an optimization computational model, and a generalized updating rule for the network (GURD) is established. It is proved that under any sequence of updating modes, the GURD converges monotonically to a stable state of the network, and that the existing updating algorithms for delayed neural networks are special cases of the GURD. Moreover, the convergence conditions of the GURD relax the original constraints, giving the rule a wider range of applications. The new updating rule and convergence conditions guarantee that the delayed network converges to a local maximum of its energy function.
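The abstract's generalized updating rule can be illustrated with a small numerical sketch. The weight matrices `A` (instantaneous term), `B` (delay term), the threshold vector `theta`, the tie-breaking convention, and the energy form `E = xᵀAx + 2xᵀBx_prev + 2θᵀx` below are illustrative assumptions, not the paper's exact formulation; the sketch shows only the general shape of a delayed discrete Hopfield update in which an arbitrary subset of neurons fires at each step:

```python
import numpy as np

def threshold(h, prev):
    # sgn(h) with ties keeping the previous value (a common convention).
    return np.where(h > 0, 1, np.where(h < 0, -1, prev))

def gurd_step(x, x_prev, A, B, theta, subset):
    # One generalized update: only the neurons in `subset` fire, driven by
    # the current state x(t) and the delayed state x(t-1).
    h = A @ x + B @ x_prev + theta
    x_next = x.copy()
    x_next[subset] = threshold(h[subset], x[subset])
    return x_next

def energy(x, x_prev, A, B, theta):
    # One energy form used for delayed networks (an assumption here):
    # E = x'Ax + 2 x'B x_prev + 2 theta'x; the dynamics seek a local maximum.
    return x @ A @ x + 2 * (x @ B @ x_prev) + 2 * (theta @ x)

def run_gurd(x0, A, B, theta, mode_seq, steps=20):
    # mode_seq is any sequence of neuron subsets (the "updating modes").
    x_prev, x = x0.copy(), x0.copy()
    energies = []
    for t in range(steps):
        x_next = gurd_step(x, x_prev, A, B, theta, mode_seq[t % len(mode_seq)])
        x_prev, x = x, x_next
        energies.append(energy(x, x_prev, A, B, theta))
    return x, x_prev, energies

# Tiny hypothetical example: two neurons, serial mode (one neuron per step).
A = np.array([[0.0, 1.0], [1.0, 0.0]])   # symmetric instantaneous weights
B = np.array([[1.0, 0.0], [0.0, 1.0]])   # delay-term weights
theta = np.array([0.5, 0.5])
x0 = np.array([-1, -1])
modes = [np.array([0]), np.array([1])]
x, x_prev, energies = run_gurd(x0, A, B, theta, modes)
```

A state is stable in the sense of the paper when a further update of every neuron leaves it unchanged, i.e. `gurd_step(x, x_prev, A, B, theta, np.arange(len(x)))` returns `x` itself.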
Source
《中山大学学报(自然科学版)》
CAS
CSCD
Peking University Core Journal (北大核心)
2000, Supplement S2, pp. 285-289 (5 pages)
Acta Scientiarum Naturalium Universitatis Sunyatseni
Funding
This work was supported in part by the National Natural Science Foundation of China and the Natural Science Foundation of Guangdong Province.
Keywords
neural network
delay
updating rule
optimization problem
convergence