
The Modified Hybrid Hestenes-Stiefel and Dai-Yuan Methods Based on the Least Squares
Abstract: The conjugate gradient method is mainly used to solve large-scale unconstrained optimization problems and is valued for its low storage requirement, strong convergence, and simple computation. For the hybrid Hestenes-Stiefel and Dai-Yuan conjugate gradient method, the hybrid parameter is determined with a least-squares idea: it is chosen to minimize the distance between the search direction of the hybrid method and that of a sufficient-descent three-term conjugate gradient method. Under the Wolfe line search, the resulting hybrid Hestenes-Stiefel and Dai-Yuan method satisfies the sufficient descent condition and converges globally for uniformly convex functions. Compared with the methods proposed by Hager-Zhang and Dai-Kou, the modified method is computationally more competitive.
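The abstract does not reproduce the formulas. As a rough illustration only, in conventional conjugate gradient notation assumed here rather than taken from the paper, the standard Hestenes-Stiefel and Dai-Yuan parameters, a convex hybridization with parameter \theta_k, and the least-squares choice of \theta_k described above can be written as

    % Standard HS and DY parameters (conventional notation, assumed)
    \beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
    \beta_k^{DY} = \frac{\| g_{k+1} \|^2}{d_k^{\top} y_k}, \qquad
    y_k = g_{k+1} - g_k,

    % Convex hybridization and the resulting search direction
    \beta_k(\theta_k) = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY}, \qquad
    d_{k+1}(\theta_k) = -g_{k+1} + \beta_k(\theta_k)\, d_k,

    % Least-squares choice of the hybrid parameter: minimize the distance to a
    % sufficient-descent three-term direction \tilde{d}_{k+1}
    \theta_k = \arg\min_{\theta \in [0,1]} \bigl\| d_{k+1}(\theta) - \tilde{d}_{k+1} \bigr\|^2,

where \tilde{d}_{k+1} denotes the search direction of the sufficient-descent three-term conjugate gradient method; the exact modified formula and the Wolfe-line-search analysis are given in the full paper.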
Author: CHEN Zhenjing (陈贞晶), School of Mathematical Sciences, Chongqing Normal University, Chongqing 401331, China
Source: Journal of Chongqing University of Science and Technology: Natural Sciences Edition, CAS, 2019, Issue 5, pp. 44-49 (6 pages)
Funding: Science and Technology Research Program of Chongqing Municipal Education Commission, project "Research and Application of Fast Distributed Online Learning Algorithms" (KJQN201800520)
Keywords: conjugate gradient method; Wolfe line search; least squares; uniformly convex function; global convergence