Abstract
The super-memory gradient method plays a special role in solving large-scale unconstrained optimization problems because of its simple iterations and low storage requirements. In this paper, by combining the diagonal-sparse quasi-Newton technique with a modified Gu and Mo non-monotone line search step-size rule, a new non-monotone super-memory gradient method for large-scale unconstrained optimization is presented, and its global convergence is analyzed. The new method has two advantages: it is numerically stable and can handle ill-conditioned problems, and it requires only simple computations, which makes it suitable for large-scale problems. Numerical examples show that the method is effective and stable in practical computation.
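The abstract names the two main ingredients, a super-memory gradient direction built from the current gradient and a few previous search directions, and a Gu-Mo-type non-monotone line search, without spelling out the iteration. The following Python sketch illustrates a generic combination of these two ideas under assumptions of my own: the function name smg_nonmonotone, the coefficient rule for the previous directions, the Zhang-Hager-style reference-value update, and all parameter defaults are illustrative, not the authors' exact algorithm, and the diagonal quasi-Newton scaling described in the paper is omitted.

```python
import numpy as np

def smg_nonmonotone(f, grad, x0, m=3, eta=0.85, sigma=1e-4, rho=0.5,
                    tol=1e-6, max_iter=1000):
    """Hypothetical sketch: a super-memory gradient direction (negative gradient
    plus up to m previous directions) with a non-monotone Armijo backtracking
    line search against a convex-combination reference value."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    D = f(x)           # non-monotone reference value (starts at f(x0))
    Q = 1.0            # accumulated weight used in the reference-value update
    past_dirs = []     # up to m previous search directions
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # super-memory gradient direction: -g plus scaled previous directions;
        # the scaling keeps the sum of coefficients below 1 so d stays a descent direction
        d = -g.copy()
        for dp in past_dirs:
            beta = (0.4 / m) * np.linalg.norm(g) / max(np.linalg.norm(dp), 1e-12)
            d += beta * dp
        if g @ d >= 0.0:
            d = -g.copy()          # safeguard: fall back to steepest descent
        # non-monotone Armijo backtracking: compare against D instead of f(x)
        alpha = 1.0
        for _bt in range(50):
            if f(x + alpha * d) <= D + sigma * alpha * (g @ d):
                break
            alpha *= rho
        x = x + alpha * d
        g = grad(x)
        past_dirs = (past_dirs + [d])[-m:]
        # update the reference value as a convex combination of D and f(x)
        Q_new = eta * Q + 1.0
        D = (eta * Q * D + f(x)) / Q_new
        Q = Q_new
    return x

# Example use on a simple ill-conditioned quadratic (illustrative only):
# A = np.diag([1.0, 1e3]); x_star = smg_nonmonotone(lambda x: x @ A @ x,
#                                                   lambda x: 2 * A @ x,
#                                                   np.array([5.0, 5.0]))
```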
Source
Chinese Journal of Engineering Mathematics (《工程数学学报》)
CSCD; Peking University Core Journals (北大核心)
2012, No. 3, pp. 375-385 (11 pages)
Funding
National Natural Science Foundation of China (10971118)
Fundamental Research Funds for the Central Universities (10CX04044A)
Keywords
nonlinear programming
sparse diagonal quasi-Newton method
non-monotone line search
super-memory gradient method
convergence