Abstract
Nonlinear conjugate gradient methods, owing to their simple iterations, low storage requirements, and the fact that their search directions need not satisfy the secant condition, play an extremely important role in solving large-scale unconstrained optimization problems. A new class of conjugate gradient methods is proposed whose search direction is a descent direction of the objective function. Assuming that the objective function is continuously differentiable, its gradient satisfies the Lipschitz condition, and the line search satisfies the Wolfe conditions, the global convergence of the proposed algorithm is discussed.
Nonlinear conjugate gradient methods have played an important role in solving large-scale unconstrained optimization problems; they are characterized by the simplicity of their iteration and their low memory requirements. It is well known that the direction generated by a conjugate gradient method may not be a descent direction. In this paper, a new class of nonlinear conjugate gradient methods is presented whose search direction is a descent direction for the objective function. If the objective function is differentiable, its gradient is Lipschitz continuous, and the line search satisfies the strong Wolfe condition, then the global convergence result is established.
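Since only the abstract is available here, the sketch below is a generic nonlinear conjugate gradient loop with a strong Wolfe line search, not the paper's specific method. The Polak-Ribière+ update for beta and the descent-direction safeguard are illustrative assumptions, chosen only to show how the descent property and the Wolfe line search interact in such an iteration.

```python
# Minimal sketch: generic nonlinear conjugate gradient with a strong Wolfe
# line search. The beta formula below (Polak-Ribiere+) is an assumption,
# not the formula proposed in the paper.
import numpy as np
from scipy.optimize import line_search


def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # scipy's line_search enforces the strong Wolfe conditions
        # with parameters c1 (sufficient decrease) and c2 (curvature).
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:       # line search failed; restart with -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ beta (illustrative stand-in for the paper's choice).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        # Safeguard: fall back to -g_new if d is not a descent direction.
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Example: minimize the Rosenbrock function.
    from scipy.optimize import rosen, rosen_der
    x_star = nonlinear_cg(rosen, rosen_der, np.zeros(5))
    print(x_star)               # should be close to the all-ones vector
```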
Source
《长江大学学报(自科版)(上旬)》
2014, No. 3, pp. I0001-I0003 (3 pages)
JOURNAL OF YANGTZE UNIVERSITY (NATURAL SCIENCE EDITION) SCI & ENG
Keywords
conjugate gradient method
line search
global convergence
unconstrained optimization