Abstract
This paper briefly analyzes the advantages and shortcomings of the Steepest Descent Algorithm (SDA) and the orthogonal-correction conjugate gradient method (CGM-OC), and proposes a new learning algorithm for multilayer feedforward neural networks, the SD-CGM-OC algorithm. The new algorithm combines the strengths of SDA and CGM-OC and exploits the structure of the model constructed for the practical problem described in the paper. It is shown that the SD-CGM-OC algorithm achieves higher learning efficiency than the conventional BP algorithm and that its convergence is quadratic. Experimental results confirm the effectiveness of the proposed learning algorithm.
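The abstract does not reproduce the algorithm itself. As a rough illustration only, the sketch below shows a generic two-phase scheme in the spirit described: a few steepest-descent steps followed by conjugate-gradient updates with exact line search, run on a small quadratic test problem. The objective, the Polak-Ribière update, and all parameter values are assumptions for illustration and are not taken from the paper; in particular, the paper's orthogonal-correction rule is not reproduced.

```python
import numpy as np

# Hypothetical quadratic test objective (not from the paper); the paper
# trains a multilayer feedforward network, but any differentiable loss
# with an available gradient fits the same loop structure.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def loss(w):
    return 0.5 * w @ A @ w - b @ w

def grad(w):
    return A @ w - b

def hybrid_sd_cg(w, n_sd=5, n_iter=50, lr=0.1, tol=1e-10):
    """Two-phase sketch: steepest descent first, then conjugate gradient.
    Illustrates the general idea only, not the SD-CGM-OC update rule."""
    g = grad(w)
    d = -g
    for k in range(n_iter):
        if np.linalg.norm(g) < tol:
            break
        if k < n_sd:
            # Phase 1: plain steepest descent with a fixed learning rate.
            w = w - lr * g
            g = grad(w)
            d = -g                      # restart the search direction
        else:
            # Phase 2: conjugate-gradient update with exact line search
            # (valid for the quadratic above); Polak-Ribiere beta shown.
            alpha = -(g @ d) / (d @ A @ d)
            w = w + alpha * d
            g_new = grad(w)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
            d = -g_new + beta * d
            g = g_new
    return w

w_star = hybrid_sd_cg(np.zeros(2))
print("solution:", w_star, "loss:", loss(w_star))
```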
Source
Journal of Xidian University (《西安电子科技大学学报》)
1999, No. 5, pp. 545-548 (4 pages)
Indexed in: EI, CAS, CSCD, Peking University Core Journals
Keywords
multilayer feedforward neural networks
learning algorithm
learning efficiency
BP algorithm
SD-CGM-OC algorithm