Abstract
To solve large-scale unconstrained optimization problems more effectively, this paper proposes an adaptive two-parameter conjugate gradient method based on the self-scaling memoryless BFGS quasi-Newton method. The designed search direction satisfies the sufficient descent condition. Under general assumptions and the standard Wolfe line search criterion, the method is proved to be globally convergent. Numerical experiments show that the proposed algorithm is effective.
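The paper's specific two-parameter update is not given in this record, so as a generic illustration of the framework the abstract describes (a nonlinear conjugate gradient iteration with a Wolfe line search and a descent safeguard), the sketch below uses the standard PRP+ beta rather than the authors' formula; the function names and parameter values are assumptions for the example only.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=60):
    """Bisection search for a step satisfying the (weak) Wolfe conditions."""
    fx, gd = f(x), grad(x) @ d   # gd < 0 is assumed (d is a descent direction)
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:   # sufficient-decrease fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:       # curvature condition fails: grow
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Nonlinear CG with PRP+ beta and Wolfe line search (generic sketch,
    not the paper's adaptive two-parameter method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ nonnegativity safeguard
        d = -g_new + beta * d
        if g_new @ d >= 0:            # restart with steepest descent if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from the standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = nonlinear_cg(f, grad, np.array([-1.2, 1.0]))
```

The descent-direction restart above plays the role of the sufficient descent property the abstract requires; in the paper this is guaranteed by construction of the two-parameter direction rather than by a fallback.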
Authors
LI Xiangli; MO Yuanjian; MEI Jianping (School of Mathematics and Computing Science, Guilin University of Electronic Technology, Guilin 541004, China; Guangxi Colleges and University Key Laboratory of Data Analysis and Computation, Guilin 541004, China; Center for Applied Mathematics of Guangxi (GUET), Guilin 541004, China)
Source
Mathematica Applicata (《应用数学》)
Peking University Core Journal (北大核心)
2024, No. 1, pp. 89-99 (11 pages)
Funding
National Natural Science Foundation of China (11961010, 61967004)
Postgraduate Innovation Program of Guilin University of Electronic Technology (2023YCXS115)
Keywords
Large-scale unconstrained optimization
Conjugate gradient method
Wolfe line search
Global convergence