Abstract
Unconstrained optimization algorithms are studied. The steepest descent method, Newton's method, and the nonlinear FR, PRP, and DY conjugate gradient methods, all effective algorithms for large-scale unconstrained optimization problems, are described, together with the conditions for exact line search, Wolfe line search, and Armijo line search. Emphasis is placed on the super-memory gradient method, a computationally more efficient approach to unconstrained optimization. A class of super-memory gradient algorithms under a Wolfe-type inexact line search is presented, and its global convergence is proved under mild conditions. The results provide a reference for solving large-scale unconstrained optimization problems and for comparing the various algorithms.
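To illustrate the family of methods the abstract surveys, the sketch below implements the nonlinear Fletcher-Reeves (FR) conjugate gradient method with Armijo backtracking. This is the one-step special case of the (super-)memory gradient idea, in which the search direction combines the current negative gradient with previous directions; the paper's specific Wolfe-type line search conditions and parameter choices are not reproduced here, and the constants `c` and `rho` are common illustrative defaults, not values from the article.

```python
import numpy as np

def fr_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear FR conjugate gradient with Armijo backtracking.

    Direction: d_k = -g_k + beta_k * d_{k-1},
    with the FR choice beta_k = ||g_k||^2 / ||g_{k-1}||^2.
    A memory gradient method generalizes this by retaining
    several past directions instead of only the last one.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first step is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: shrink t until sufficient decrease holds.
        t, c, rho = 1.0, 1e-4, 0.5
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # FR formula
        d = -g_new + beta * d
        # Restart with steepest descent if d fails to be a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, minimizing the convex quadratic f(x) = 0.5 xᵀAx − bᵀx drives the gradient to zero at the solution of Ax = b; the restart safeguard keeps the Armijo loop well defined even when the inexact line search degrades conjugacy.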
Source
Journal of Shanghai Polytechnic University (《上海第二工业大学学报》), 2006, No. 2, pp. 142-146 (5 pages)
Keywords
unconstrained optimization
super-memory gradient method
global convergence