
A Modification of the Scalar β_k for the Conjugate Gradient Method
Abstract: The conjugate gradient method is one of the most widely used optimization algorithms. It is well suited to large-scale problems and therefore finds broad application, and different choices of the scalar β_k give rise to different conjugate gradient methods. This paper modifies the scalar β_k in the conjugate gradient method, extends it to a general setting for solving unconstrained optimization problems, and proves the global convergence of the resulting method under the Wolfe line search.
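The abstract describes the general scheme but does not reproduce the paper's modified β_k, so the following is only an illustrative sketch. A nonlinear conjugate gradient method iterates x_{k+1} = x_k + α_k d_k with d_0 = -g_0 and d_{k+1} = -g_{k+1} + β_k d_k, where g_k = ∇f(x_k) and the step length α_k satisfies the Wolfe conditions f(x_k + α_k d_k) ≤ f(x_k) + c_1 α_k g_k^T d_k and ∇f(x_k + α_k d_k)^T d_k ≥ c_2 g_k^T d_k with 0 < c_1 < c_2 < 1. The Python sketch below uses the classical Polak-Ribiere+ formula purely as a stand-in for the paper's modified scalar, and the function name nonlinear_cg is hypothetical; SciPy's line_search routine supplies a Wolfe-condition step length.

# A minimal sketch of a nonlinear conjugate gradient iteration under a Wolfe
# line search, assuming NumPy and SciPy are available. The beta_k used here
# (Polak-Ribiere+) is only a placeholder for the modified scalar proposed in
# the paper.
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # alpha_k satisfying the Wolfe conditions (c2 = 0.1 is customary for CG)
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0] or 1e-4
        x = x + alpha * d
        g_new = grad(x)
        # Placeholder beta_k (Polak-Ribiere+), not the paper's modified scalar
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        g = g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))    # iterates approach the minimizer [1, 1]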
Source: Journal of Inner Mongolia University of Technology (Natural Science Edition), 2011, No. 2, pp. 98-101 (4 pages)
Keywords: unconstrained optimization problem; conjugate gradient method; Wolfe line search; global convergence
