Abstract
To solve the optimization problem of large-scale wavelet neural networks, a fast quasi-Newton learning algorithm is proposed: a quasi-Newton algorithm that stores only gradient vectors, combined with an improved Wolfe line search. The algorithm computes at most two gradients per iteration and stores only gradient vectors rather than an approximate Hessian matrix, which avoids the Hessian storage problem and greatly reduces the computational and memory requirements. Simulations verify the validity and feasibility of the algorithm.
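The abstract describes a memoryless quasi-Newton scheme: the BFGS update is applied to the identity matrix at every iteration, so the search direction can be formed from the step vector s, the gradient difference y, and the current gradient alone, with no Hessian approximation ever stored. Below is a minimal sketch of this idea in Python, not the authors' implementation: it assumes SciPy's strong-Wolfe line_search as a stand-in for the paper's improved Wolfe search (which, unlike this sketch, bounds the gradient evaluations at two per iteration), and the function name memoryless_bfgs and all tolerances are illustrative.

```python
import numpy as np
from scipy.optimize import line_search

def memoryless_bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimize f from x0 storing only vectors (no Hessian matrix)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search; SciPy's enforces the strong Wolfe conditions.
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                   # search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                alpha = 1e-4                # last-resort small step
        s = alpha * d                       # step vector
        x = x + s
        g_new = grad(x)
        y = g_new - g                       # gradient difference
        sy = s @ y                          # positive under Wolfe conditions
        if sy > 1e-12:
            # Memoryless BFGS: apply the BFGS update to H = I and form
            # d = -H @ g_new using only the vectors s, y and g_new.
            rho = 1.0 / sy
            a, b = s @ g_new, y @ g_new
            d = (-g_new + rho * a * y
                 + (rho * b - rho * a - rho**2 * (y @ y) * a) * s)
        else:
            d = -g_new                      # curvature too small: restart
        g = g_new
    return x

# Example: minimize the Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(memoryless_bfgs(f, grad, np.array([-1.2, 1.0])))
```

Because only a handful of n-dimensional vectors are kept, the per-iteration memory cost is O(n) rather than the O(n²) of a stored Hessian approximation, which is what makes this family of methods attractive for large-scale network training.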
Source
Journal of Kunming University of Science and Technology (Natural Science)
Indexed in: CAS; Peking University Core Journals (北大核心)
2013, No. 6, pp. 54-60 (7 pages)
Funding
Natural Science Foundation of Shaanxi Province (2007F49)
Keywords
wavelet
neural network
unconstrained optimization problem
quasi-Newton algorithm
Wolfe line search