Abstract
The conjugate gradient method is one of the most commonly used optimization algorithms; it is well suited to large-scale problems and is therefore widely applied, and different choices of the scalar β_k yield different conjugate gradient methods. This paper modifies the scalar β_k of the conjugate gradient method, generalizes it, and proves the global convergence of the resulting method under the Wolfe line search.
The conjugate gradient method is among the most widely used methods for optimization problems. It has found wide application as it is well suited to solving large-scale optimization problems. However, different selections of β_k give rise to different conjugate gradient methods. By modifying β_k, a new conjugate gradient method is proposed that makes it possible to solve unconstrained optimization problems. Global convergence of the new method is proved under the Wolfe line search.
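The abstract does not reproduce the paper's modified β_k, so as an illustration of the general scheme it describes, the following is a minimal sketch of a nonlinear conjugate gradient iteration using the classical Fletcher-Reeves choice of β_k and a simple backtracking search that checks the weak Wolfe conditions. All names and parameter values are illustrative, not the paper's.

```python
# Sketch of a generic nonlinear conjugate gradient method. The beta_k used
# here is the classical Fletcher-Reeves formula, NOT the modified beta_k of
# the paper (which the abstract does not state).

def nonlinear_cg(f, grad, x0, c1=1e-4, c2=0.9, tol=1e-8, max_iter=200):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]          # initial direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 ** 0.5 < tol:    # gradient small enough: stop
            break
        # Crude line search: halve alpha until the weak Wolfe conditions
        # (Armijo sufficient decrease + curvature condition) both hold.
        alpha = 1.0
        fx = f(x)
        slope = sum(gi * di for gi, di in zip(g, d))   # directional derivative
        for _ in range(50):
            xn = [xi + alpha * di for xi, di in zip(x, d)]
            gn = grad(xn)
            armijo = f(xn) <= fx + c1 * alpha * slope
            curvature = sum(gni * di for gni, di in zip(gn, d)) >= c2 * slope
            if armijo and curvature:
                break
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Fletcher-Reeves beta_k = ||g_{k+1}||^2 / ||g_k||^2
        beta = sum(gi * gi for gi in g_new) / gnorm2
        d = [-gni + beta * di for gni, di in zip(g_new, d)]
        g = g_new
    return x

# Usage on a simple strictly convex quadratic; the iterates should
# approach the minimizer (1, -2).
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: [2 * (x[0] - 1), 20 * (x[1] + 2)]
x_star = nonlinear_cg(f, grad, [0.0, 0.0])
```

Different β_k formulas (Fletcher-Reeves, Polak-Ribière, etc.) plug into the same iteration; only the `beta` line changes, which is the design point the abstract makes about the choice of β_k.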
Source
Journal of Inner Mongolia University of Technology: Natural Science Edition
2011, No. 2, pp. 98-101 (4 pages)
Keywords
unconstrained optimization problem
conjugate gradient method
Wolfe line search
global convergence