Abstract
A backpropagation (BP) algorithm with adaptive learning step sizes for training multilayer feedforward neural networks is presented. A gradient-based optimization technique is used to adjust the learning step sizes automatically so as to minimize the cost function. The proposed algorithm thus overcomes the drawback of choosing learning step sizes empirically and accelerates the training of the neural network. The convergence of the mean weights is also discussed.
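The record does not reproduce the paper's exact update rule, but the idea of driving the step size itself by gradient descent on the cost function can be illustrated with a minimal sketch. The version below uses one common realization, the identity dJ_t/d(eta) = -(g_t · g_{t-1}) for the step size applied at the previous update; the network architecture, the toy data, and the meta step size mu are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

# Minimal sketch (not the paper's exact algorithm): a one-hidden-layer
# feedforward network trained by BP, where the learning step size eta is
# itself adapted by gradient descent on the cost J, using
# dJ/d(eta) = -(g_t . g_{t-1}).  All constants below are assumed values.

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x)
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
Y = np.sin(X)

# Network parameters: 1 -> 8 -> 1 with tanh hidden units
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

eta = 0.01     # initial learning step size (assumed value)
mu  = 1e-4     # meta step size for adapting eta (assumed value)
prev_grad = None

def flatten(grads):
    return np.concatenate([g.ravel() for g in grads])

for epoch in range(2000):
    # Forward pass
    H = np.tanh(X @ W1 + b1)
    out = H @ W2 + b2
    err = out - Y
    J = 0.5 * np.mean(err ** 2)          # cost function to minimize

    # Backward pass (standard BP gradients)
    d_out = err / len(X)
    gW2 = H.T @ d_out; gb2 = d_out.sum(0)
    d_H = (d_out @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ d_H;   gb1 = d_H.sum(0)
    g = flatten([gW1, gb1, gW2, gb2])

    # Adapt the step size: dJ/d(eta) = -(g_t . g_{t-1})
    if prev_grad is not None:
        eta += mu * float(g @ prev_grad)
        eta = max(eta, 1e-6)             # keep the step size positive
    prev_grad = g

    # Weight update with the adapted step size
    W1 -= eta * gW1; b1 -= eta * gb1
    W2 -= eta * gW2; b2 -= eta * gb2

    if epoch % 500 == 0:
        print(f"epoch {epoch:4d}  cost {J:.5f}  eta {eta:.5f}")
```

With this kind of rule the step size grows while successive gradients point in consistent directions and shrinks when they oscillate, which is the behaviour the abstract attributes to automatic, gradient-driven step-size adjustment; a small mu and the positivity clamp keep the adaptation stable.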
Source
Journal of Guilin Institute of Electronic Technology (《桂林电子工业学院学报》)
1993, No. 2, pp. 38-45 (8 pages)
Keywords
neural networks
learning rate
adaptive algorithm