
Research of new learning method of feedforward neural network (前馈型神经网络新学习算法的研究)

Cited by: 40
Abstract: Feedforward neural networks have broad prospects for application in the modeling and control of nonlinear systems, but the standard learning algorithm for such networks, the backpropagation (BP) algorithm, has a number of shortcomings. To improve the learning efficiency and stability of multilayer feedforward neural networks, nonlinear least squares methods are introduced. Comparison with other learning algorithms leads to the conclusion that the Powell method, in which derivatives are approximated by difference quotients, is an efficient and fast learning method: its learning speed is about an order of magnitude higher than that of the adaptive-learning-rate BP algorithm with a momentum term, and its stability is much better than that of variable metric methods such as Davidon-Fletcher-Powell (DFP) and Broyden-Fletcher-Goldfarb-Shanno (BFGS), as well as other nonlinear least squares methods.
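A minimal sketch of the idea described in the abstract, not the paper's exact Powell variant: treat network training as a nonlinear least-squares problem and replace analytic derivatives with difference quotients. The Python sketch below uses a Levenberg-Marquardt-style damped Gauss-Newton step with a forward-difference Jacobian on a tiny one-hidden-layer network; the network size, toy data, damping schedule, and all names are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: approximate y = sin(x) on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 40).reshape(-1, 1)
y = np.sin(X).ravel()

n_in, n_hidden, n_out = 1, 6, 1

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def residuals(w):
    """Residual vector r(w): network outputs minus targets."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)        # hidden layer
    out = (h @ W2 + b2).ravel()     # linear output layer
    return out - y

def fd_jacobian(w, eps=1e-6):
    """Forward difference-quotient approximation of the Jacobian dr/dw."""
    r0 = residuals(w)
    J = np.empty((r0.size, w.size))
    for j in range(w.size):
        w_step = w.copy()
        w_step[j] += eps
        J[:, j] = (residuals(w_step) - r0) / eps
    return J

# Damped Gauss-Newton (Levenberg-Marquardt-style) iteration on the
# sum-of-squared-residuals objective.
n_params = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
w = rng.normal(scale=0.5, size=n_params)
lam = 1e-2
for it in range(200):
    r = residuals(w)
    J = fd_jacobian(w)
    # Solve (J^T J + lam I) dw = -J^T r for the update step.
    A = J.T @ J + lam * np.eye(n_params)
    dw = np.linalg.solve(A, -J.T @ r)
    if np.linalg.norm(residuals(w + dw)) < np.linalg.norm(r):
        w, lam = w + dw, lam * 0.7   # accept step, relax damping
    else:
        lam *= 2.0                   # reject step, increase damping

print("final sum of squared errors:", np.sum(residuals(w) ** 2))

The forward-difference Jacobian plays the role of "replacing derivatives with difference quotients" mentioned in the abstract; a production implementation would more likely use exact backpropagated derivatives or a dedicated nonlinear least-squares solver.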
Source: Journal of Tsinghua University (Science and Technology) (《清华大学学报(自然科学版)》), 1999, No. 3, pp. 1-3 (3 pages). Indexed in EI, CAS, CSCD, Peking University Core (北大核心).
Funding: National Climbing Program (国家攀登计划)
Keywords: feedforward neural network; learning algorithm; nonlinear least squares method; neural network