
A New BP Learning Algorithm Based on a Modified Error Function (cited by: 10)

New Learning Algorithm of Neural Network Based on Modified Error Function
Abstract: By analyzing the influence of hidden-layer neuron saturation on the performance of multilayer feedforward neural networks, a new error function is constructed and an adaptive method for magnifying the error signal is designed, yielding an improved BP learning algorithm. The algorithm has a simple flow and requires no heavy computation. Simulation results show that the new algorithm clearly outperforms other traditional BP algorithms in convergence rate and in its ability to avoid local minima of the error function.
Source: Journal of System Simulation (系统仿真学报; EI, CAS, CSCD, Peking University Core), 2007, No. 19: 4591-4593, 4598 (4 pages)
Keywords: feedforward neural networks; learning algorithm; saturation degree; local minima; error signal
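
The abstract describes the method only at a high level; the exact form of the modified error function and of the adaptive error-signal magnification rule is given in the full paper, not here. The sketch below is a minimal illustration of the general idea under stated assumptions: a two-layer tanh/sigmoid network trained on XOR, with a hypothetical factor `lam` (computed from a crude hidden-layer saturation measure) that amplifies the back-propagated error signal when hidden neurons approach saturation. It is not the authors' formulation.

```python
# Minimal sketch, NOT the paper's exact algorithm: standard batch BP on a
# 2-4-1 network, plus a hypothetical adaptive factor `lam` that magnifies the
# error signal when the tanh hidden layer is strongly saturated, so updates
# do not vanish in the flat regions of the activation function.
import numpy as np

rng = np.random.default_rng(0)

def train_xor(epochs=5000, lr=0.5):
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)   # input -> hidden
    W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)   # hidden -> output

    for _ in range(epochs):
        # forward pass
        H = np.tanh(X @ W1 + b1)                    # hidden activations in (-1, 1)
        Y = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))    # sigmoid output

        err = T - Y                              # error for E = 0.5 * sum(err ** 2)
        sat = np.mean(np.abs(H))                 # crude saturation measure in [0, 1)
        lam = 1.0 + 4.0 * sat                    # hypothetical magnification factor

        # backward pass with the magnified error signal (an assumption here,
        # not the paper's adaptive rule)
        delta_out = lam * err * Y * (1.0 - Y)
        delta_hid = (delta_out @ W2.T) * (1.0 - H ** 2)

        # batch gradient-descent updates on E
        W2 += lr * H.T @ delta_out / len(X); b2 += lr * delta_out.mean(axis=0)
        W1 += lr * X.T @ delta_hid / len(X); b1 += lr * delta_hid.mean(axis=0)

    H = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))

if __name__ == "__main__":
    print(np.round(train_xor(), 3))   # should approach [[0], [1], [1], [0]]
```

With `lam` held at 1 the loop reduces to plain BP; the extra factor only compensates for the small derivatives of saturated units, which is the effect the paper targets with its modified error function and adaptive error-signal amplification.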

References (9)

  • 1. Y H Zweiri, J F Whidborne, L D Seneviratne. A three-term backpropagation algorithm [J]. Neurocomputing (S0925-2312), 2003, 50(5): 305-318.
  • 2. Gao Daqi, Yang Genxing. Influences of variable scales and activation functions on the performances of multilayer feedforward neural networks [J]. Pattern Recognition (S0031-3203), 2003, 36(4): 869-878.
  • 3. Nicholas K Treadgold, Tamas D Gedeon. Simulated annealing and weight decay in adaptive learning: the SARPROP algorithm [J]. IEEE Transactions on Neural Networks (S1045-9227), 1998, 9(4): 662-668.
  • 4. Sin-Chun Ng, Chi-Chung Cheung, Shu-Hung Leung. Magnified gradient function with deterministic weight modification in adaptive learning [J]. IEEE Transactions on Neural Networks (S1045-9227), 2004, 15(6): 1411-1423.
  • 5. Minghu Jiang, Beixing Deng, Bin Wang, Bo Zhang. A fast learning algorithm of neural networks by changing error function [C]. IEEE Int. Conf. Neural Networks & Signal Processing, 2003, 1(2): 249-252.
  • 6. Lodewyk F A Wessels, Etienne Barnard. Avoiding false local minima by proper initialization of connections [J]. IEEE Transactions on Neural Networks (S1045-9227), 1992, 3(6): 899-905.
  • 7. Xiao-Hu Yu, Guo-An Chen. Efficient Backpropagation Learning Using Optimal Learning Rate and Momentum [J]. Neural Networks (S0893-6080), 1997, 10(3): 517-527.
  • 8. Yu X H, Chen G A, Cheng S X. Dynamic learning rate optimization of the backpropagation algorithm [J]. IEEE Transactions on Neural Networks (S1045-9227), 1995, 6(3): 669-677.
  • 9. Simon Haykin. Neural Networks: A Comprehensive Foundation [M]. Beijing: China Machine Press, 2004: 109-175.

Co-cited references: 81; Citing articles: 10; Second-level citing articles: 34
