Abstract
A new class of memory gradient methods is proposed for unconstrained optimization problems. The algorithm takes a linear combination of the current negative gradient and the previous search direction as its search direction, and determines the step size by a strong Wolfe line search. Global convergence of the algorithm is proved, and a linear convergence rate is established when the objective function is uniformly convex.
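The iteration described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the combination coefficient `beta` and the restart safeguard below are illustrative assumptions (the paper's specific rule for the coefficient is not reproduced here), and the step size comes from SciPy's `line_search`, which enforces the strong Wolfe conditions.

```python
import numpy as np
from scipy.optimize import line_search

def memory_gradient(f, grad, x0, beta=0.5, tol=1e-6, max_iter=500):
    """Memory gradient sketch: d_k = -g_k + beta_k * d_{k-1},
    with alpha_k from a strong Wolfe line search.
    The choice beta_k = beta * ||g_k|| / ||d_{k-1}|| is illustrative,
    not the rule analyzed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first iteration: plain steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # strong Wolfe line search along d (c1, c2 as in the usual conditions)
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.4)[0]
        if alpha is None:          # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-8
        x = x + alpha * d
        g = grad(x)
        # new direction: negative gradient plus scaled previous direction
        d = -g + beta * np.linalg.norm(g) / max(np.linalg.norm(d), 1e-12) * d
        # safeguard (an assumption here): keep d a descent direction
        if g @ d > -1e-12 * np.linalg.norm(g) * np.linalg.norm(d):
            d = -g
    return x
```

On a uniformly convex quadratic such as f(x) = ½ xᵀ diag(1, 10) x, the iterates converge to the minimizer at the origin, matching the setting in which the abstract claims a linear rate.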
Source
Journal of Southwest Minzu University (Natural Science Edition) (《西南民族大学学报(自然科学版)》)
CAS
2008, No. 1, pp. 65-69 (5 pages)
Keywords
unconstrained optimization
memory gradient method
strong Wolfe line search
global convergence
linear convergence rate