Abstract
BP (Back Propagation) neural networks are widely used, but they are often criticized for slow convergence and low prediction accuracy. This paper therefore applies the additional momentum term method and the dynamic learning rate method to the traditional BP neural network and, building on the combination of the two, proposes a steepness-adjustable Tanh activation function to further improve it. Taking nonlinear function fitting as an example, the traditional and improved networks are compared in terms of convergence speed and prediction accuracy. The experimental results show that the proposed improvements significantly increase both the convergence speed and the accuracy of the BP neural network.
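As a rough illustration of the three improvements named in the abstract, the sketch below combines a weight update with an additional momentum term, a simple loss-driven learning-rate adjustment, and a tanh activation with a steepness factor. The single-unit model, the function names, and the specific adaptation constants are assumptions for illustration only; the paper's exact formulas are not reproduced here.

```python
import numpy as np

def tanh_steep(x, beta=1.0):
    """Tanh activation with an adjustable steepness factor beta (illustrative form)."""
    return np.tanh(beta * x)

def tanh_steep_grad(x, beta=1.0):
    """Derivative of tanh(beta * x) with respect to x."""
    return beta * (1.0 - np.tanh(beta * x) ** 2)

def train_step(W, x, target, prev_dW, lr, momentum=0.9, beta=1.5):
    """One update for a single unit y = tanh_steep(W @ x) under squared error,
    where the weight change adds a momentum term to the current gradient step."""
    z = W @ x
    err = tanh_steep(z, beta) - target
    grad = err * tanh_steep_grad(z, beta) * x        # dE/dW for E = 0.5 * err^2
    dW = -lr * grad + momentum * prev_dW             # additional momentum term
    return W + dW, dW, 0.5 * err ** 2

def adapt_lr(lr, loss, prev_loss, up=1.05, down=0.7):
    """Dynamic learning rate: grow lr slightly when the loss falls,
    shrink it when the loss rises (a common heuristic, not the paper's exact rule)."""
    return lr * (up if loss < prev_loss else down)

# Toy usage on a single training sample (illustrative only).
rng = np.random.default_rng(0)
W, prev_dW, lr, prev_loss = rng.normal(size=3), np.zeros(3), 0.1, np.inf
x, t = np.array([0.2, -0.5, 0.8]), 0.3
for _ in range(200):
    W, prev_dW, loss = train_step(W, x, t, prev_dW, lr)
    lr, prev_loss = adapt_lr(lr, loss, prev_loss), loss
```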
Source
《微型机与应用》
2017, No. 6, pp. 53-57, 61 (6 pages)
Microcomputer & Its Applications
Keywords
BP neural network
additional momentum term
dynamic learning rate
steepness-adjustable Tanh activation function