Abstract
Two dynamical properties of neural networks that are closely related to associative memory are the stability and the attractivity of the network. For networks with higher-order connections, we first discuss stability and attractivity under the commonly used Hebb rule. Since the Hebb rule gives good associative recall only for orthogonal or nearly orthogonal prototype patterns, we then give, for general linearly independent patterns, a rank-one tensor form of the higher-order connection weights that guarantees these patterns are stable equilibrium points of the network; this form can be regarded as a higher-order pseudo-inverse (projection) rule. By analyzing the stability and attractivity of the network, some sufficient conditions are obtained.
In neural networks, two dynamical properties acting in close coordination with associative memory are stability and attractivity. Considering higher-order correlations, we first discuss the stability and attractivity of the networks under the usual Hebb rule. Because the Hebb rule performs well only for orthogonal or nearly orthogonal patterns, we give another learning rule for general linearly independent prototypes. This rule puts the higher-order correlation weights into rank-one tensor form, which can be regarded as a higher-order pseudo-inverse (projection) rule. By analyzing the stability and attractivity of the networks, we obtain some sufficient conditions that are of importance in guiding the design and synthesis of higher-order neural networks.
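The two learning rules discussed in the abstract can be illustrated with a minimal sketch. The code below assumes a second-order (quadratic) Hopfield-type network with bipolar states, Hebbian outer-product weights T[i,j,k] = Σ_μ x_i x_j x_k, and the sign(0) → +1 convention; the projection rule is shown in its classical first-order form W = XᵀT(XXᵀT)⁻¹X, as a stand-in for the paper's higher-order rank-one tensor version, whose exact form is not given here. All function names are illustrative, not from the paper.

```python
import numpy as np

def hebb_second_order(patterns):
    """Second-order Hebb rule: T[i,j,k] = sum over patterns of x_i * x_j * x_k."""
    n = patterns.shape[1]
    T = np.zeros((n, n, n))
    for x in patterns:
        T += np.einsum('i,j,k->ijk', x, x, x)  # rank-one (outer-product) term per pattern
    return T

def update(T, s):
    """One synchronous update: s_i <- sign(sum_{j,k} T[i,j,k] s_j s_k), sign(0) -> +1."""
    h = np.einsum('ijk,j,k->i', T, s, s)
    return np.where(h >= 0, 1, -1)

def projection_rule(patterns):
    """First-order pseudo-inverse (projection) rule for linearly independent rows:
    W = X^T (X X^T)^{-1} X projects onto the span of the stored patterns."""
    X = patterns.astype(float)
    return X.T @ np.linalg.inv(X @ X.T) @ X

# Two mutually orthogonal bipolar patterns (rows of a Hadamard matrix),
# for which the Hebb rule is guaranteed to work:
patterns = np.array([[1,  1, 1,  1],
                     [1, -1, 1, -1]])
T = hebb_second_order(patterns)
for x in patterns:
    assert np.array_equal(update(T, x), x)   # stored patterns are fixed points

# Attractivity: a one-bit-corrupted version of patterns[0] is pulled back in one step.
noisy = np.array([-1, 1, 1, 1])
assert np.array_equal(update(T, noisy), patterns[0])

# Projection rule: for linearly independent patterns, W x = x exactly,
# so every stored pattern is a stable equilibrium regardless of orthogonality.
W = projection_rule(patterns)
for x in patterns:
    assert np.allclose(W @ x, x)
```

For orthogonal patterns the crosstalk term in the Hebbian local field vanishes, which is why the Hebb rule succeeds above; the projection rule removes that restriction to mere linear independence, at the price of a matrix inversion.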
Source
《西安交通大学学报》
EI
CAS
CSCD
Peking University Core (北大核心)
1996, No. 8, pp. 96-103 (8 pages)
Journal of Xi'an Jiaotong University
Funding
National Natural Science Foundation of China
Research Fund of Xi'an Jiaotong University
Keywords
neural networks
associative memory
connections
tensor description
dynamics
neural networks; associative memory; higher-order correlation; tensor of rank one