EFFICIENT GRADIENT DESCENT METHOD OF RBF NEURAL NETWORKS WITH ADAPTIVE LEARNING RATE
Authors: Lin Jiayu, Liu Ying (School of Electro. Sci. and Tech., National Univ. of Defence Technology, Changsha 410073). Journal of Electronics (China), 2002, Issue 3, pp. 255-258 (4 pages).
Abstract: A new algorithm for adapting the learning rate of the gradient descent method is presented, based on a second-order Taylor expansion of the error energy function with respect to the learning rate, evaluated at expansion points chosen by an "award-punish" strategy. A detailed derivation of the algorithm as applied to RBF networks is given. Simulation studies show that the algorithm increases the rate of convergence and improves the performance of the gradient descent method.
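The abstract's core idea can be illustrated in code. Below is a minimal sketch, not the paper's exact algorithm: the error along the negative-gradient direction is treated as a function of the learning rate, its first and second derivatives at the current rate are estimated by finite differences (standing in for the paper's analytic Taylor expansion for RBF networks), and the quadratic model is minimized to pick the next rate; a simple reward/shrink rule plays the role of the "award-punish" strategy. All function names, factors, and tolerances here are illustrative assumptions.

```python
import numpy as np

def taylor_rate(loss, w, g, eta0, eps=1e-4):
    """Estimate a near-optimal learning rate from a second-order Taylor
    expansion of phi(eta) = loss(w - eta * g) around the point eta0.
    (Finite differences are an assumption standing in for the paper's
    analytic derivatives for RBF networks.)"""
    phi = lambda eta: loss(w - eta * g)
    f0, fp, fm = phi(eta0), phi(eta0 + eps), phi(eta0 - eps)
    d1 = (fp - fm) / (2 * eps)          # phi'(eta0)
    d2 = (fp - 2 * f0 + fm) / eps ** 2  # phi''(eta0)
    if d2 > 0:                          # positive curvature: minimize parabola
        return eta0 - d1 / d2
    return eta0                         # non-convex locally: keep current rate

def train(loss, grad, w, eta=0.1, reward=1.05, punish=0.7, steps=50):
    """Gradient descent whose rate comes from the Taylor estimate; an
    'award-punish' rule (hypothetical factors) moves the expansion point:
    grow it after an error decrease, shrink it after an increase."""
    prev = loss(w)
    for _ in range(steps):
        g = grad(w)
        eta_star = taylor_rate(loss, w, g, eta)
        w_new = w - eta_star * g
        cur = loss(w_new)
        if cur < prev:                  # award: accept step, enlarge rate
            w, prev, eta = w_new, cur, eta_star * reward
        else:                           # punish: reject step, shrink rate
            eta *= punish
    return w, prev
```

On a quadratic test error the Taylor step reduces to an exact line search along the gradient, so the loop converges much faster than a fixed small rate would; on a general error surface the finite-difference curvature is only a local model, which is why the award-punish fallback is kept.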
Keywords: Gradient descent method; Learning rate; RBF neural networks