Abstract
The conjugate gradient method is an effective method for solving unconstrained optimization problems. Since the conjugate gradient method uses only first-order gradient information and ignores objective function values, in order to make full use of both function value and gradient information, this paper combines the secant conditions proposed by Saman et al. and Zahra Khoshgam et al. and, building on the TMDL and TMDL+ methods, presents modified conjugate gradient methods based on a modified secant equation, the STMDL and STMDL+ methods, whose sufficient descent property holds independently of the line search. It is proved that the STMDL method converges strongly for uniformly convex functions under the Wolfe line search, and that the STMDL+ method is globally convergent for general functions. Numerical results show that the STMDL+ method outperforms the HZ+ and DK+ methods.
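For readers without access to the full paper, a brief sketch of the objects named above may help; the specific STMDL/STMDL+ formulas are not reproduced here, and the modified secant equation shown is a standard form from the literature that the paper's variant may refine. A nonlinear conjugate gradient method generates iterates

x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,

where g_k = \nabla f(x_k), the step size \alpha_k is chosen by the Wolfe line search, and the choice of the scalar \beta_k distinguishes one method from another. The classical secant condition B_{k+1} s_k = y_k, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, uses gradient information only; a widely used modified secant equation that also absorbs objective function values (the Zhang, Deng and Chen form) reads

B_{k+1} s_k = \tilde{y}_k, \qquad \tilde{y}_k = y_k + \frac{\theta_k}{s_k^{\top} u_k}\, u_k, \qquad \theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{\top} s_k,

where u_k is any vector satisfying s_k^{\top} u_k \neq 0 (commonly u_k = s_k). The sufficient descent property referenced in the abstract is the condition g_k^{\top} d_k \le -c\,\|g_k\|^2 for some constant c > 0, holding independently of the line search.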
Authors
LI Yue; CHEN Zhenjing (School of Mathematics Science, Chongqing Normal University, Chongqing 401331, China)
Source
Journal of Dongguan University of Technology, 2021, No. 1, pp. 13-18 (6 pages)
Funding
Supported by the Natural Science Foundation of Chongqing (cstc2017jcyjA0788).
Keywords
conjugate gradient method
unconstrained optimization
sufficient descent
global convergence