Abstract
This paper constructs a new optimization target function for feedforward neural networks using the Young inequality from the theory of conjugate convex functions. With the inter-layer weights fixed, the target function is convex in the hidden-layer outputs; with the hidden-layer outputs fixed, it is convex in the weights. As a result, the target function has no local minima and can be optimized rapidly, which greatly improves the learning efficiency of feedforward neural networks. Simulation experiments show that, compared with existing algorithms such as the error back-propagation (BP) algorithm, the BP algorithm with a momentum factor, and existing layer-wise optimization algorithms, the new algorithm converges faster and achieves a lower learning error. Applying the fast algorithm to ore deposit simulation forecasting yields more effective results.
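For reference, the construction described above builds on the standard (Fenchel-)Young inequality relating a convex function to its conjugate; the paper's specific target function is not reproduced in this record, so the sketch below only states the underlying inequality that the abstract refers to.

% Fenchel-Young inequality: for a convex function f and its conjugate f^*,
% the inequality below holds for all x, y, with equality exactly when
% y is a subgradient of f at x.
\[
  f(x) + f^{*}(y) \;\ge\; \langle x, y \rangle,
  \qquad\text{where}\qquad
  f^{*}(y) \;=\; \sup_{x}\,\bigl(\langle x, y\rangle - f(x)\bigr).
\]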
Source
《计算机仿真》
CSCD
2004, No. 9, pp. 113-116 (4 pages)
Computer Simulation
Funding
President Fund of the Graduate School of the Chinese Academy of Sciences (YZJJ200206)