Abstract
Based on a modified conjugate gradient formula, this paper proposes a conjugate gradient method with the sufficient descent property. No line search is required: the step-length is computed by a fixed formula. To some extent, the method exploits second-order information about the objective function, since the step-length can be viewed as the exact one-dimensional minimizer of an (approximate) quadratic model of the objective. Because each iteration requires only one gradient evaluation, the method is well suited to large-scale optimization problems. A global convergence analysis is given, and a strong convergence result is obtained. Preliminary numerical results show that the method is very promising.
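The paper's own modified formula is not reproduced in this abstract, but the underlying idea, that an exact line search on a quadratic model admits a closed-form step-length, is illustrated by classical linear conjugate gradient. The sketch below (a standard textbook algorithm, not the authors' method) minimizes the quadratic f(x) = ½xᵀAx − bᵀx, where the exact minimizer along each direction is given by a fixed formula rather than an iterative line search:

```python
import numpy as np

def linear_cg(A, b, x0, tol=1e-10, max_iter=1000):
    """Classical conjugate gradient for f(x) = 0.5 x^T A x - b^T x, A SPD.

    On a quadratic model the exact line-search step has the closed form
    alpha = r^T r / (d^T A d), so no iterative line search is needed and
    each iteration uses a single gradient (residual) evaluation.
    """
    x = x0.astype(float)
    r = b - A @ x            # residual = negative gradient of f at x
    d = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)        # exact minimizer along d (fixed formula)
        x = x + alpha * d
        r = r - alpha * Ad               # update residual without a new A @ x
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d    # Fletcher-Reeves-type direction update
        rs_old = rs_new
    return x
```

For a general nonlinear objective, the quadratic model is only approximate, which is why methods of the kind described in the abstract must additionally establish sufficient descent and global convergence.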
Source
《数值计算与计算机应用》 (Journal on Numerical Methods and Computer Applications), CSCD, 2006, No. 3, pp. 183-190 (8 pages)
Funding
National Natural Science Foundation of China (60475042); the Advanced Research Center of Zhongshan University, Hong Kong.
Keywords
unconstrained optimization, conjugate gradient method, line search, global convergence