Abstract
This paper extends supermemory gradient methods for unconstrained optimization to nonlinear inequality-constrained optimization and presents two classes of supermemory feasible direction methods in quite general form. Under the weak assumptions of nondegeneracy and continuous differentiability, every algorithm in these classes is globally convergent. By suitable choices of the parameters and memory directions, some known methods as well as new ones are obtained, and the rate of convergence of the methods may be improved.
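As an illustration only, and not the paper's constrained algorithm (which builds feasible directions for inequality constraints), a minimal sketch of the underlying supermemory gradient idea in the unconstrained case — the search direction mixes the current negative gradient with the last m remembered directions — might look like the following; the parameter names and damping scheme here are assumptions for the sketch:

```python
import numpy as np

def supermemory_gradient(f_grad, x0, m=2, beta=0.5, step=0.1, iters=200):
    """Illustrative unconstrained supermemory gradient iteration:
    the search direction d_k combines the negative gradient -g_k
    with damped contributions from the last m directions (the "memory")."""
    x = np.asarray(x0, dtype=float)
    memory = []  # previous search directions d_{k-1}, ..., d_{k-m}
    for _ in range(iters):
        d = -f_grad(x)
        # add a geometrically damped contribution from each remembered direction
        for i, d_old in enumerate(memory, start=1):
            d = d + (beta ** i) * d_old
        x = x + step * d          # fixed step length for simplicity
        memory = [d] + memory[:m - 1]
    return x

# usage: minimize f(x) = (x0 - 1)^2 + 2*(x1 + 3)^2, minimizer (1, -3)
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 3)])
x_star = supermemory_gradient(grad, [0.0, 0.0])
```

With m = 0 this reduces to plain gradient descent, and with m = 1 it resembles a heavy-ball/momentum step; choosing the memory weights adaptively is what gives supermemory methods their potential speed-up.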
Source
Mathematica Applicata (《应用数学》), CSCD, Peking University core journal, 1992, No. 1, pp. 22–28 (7 pages)
Funding
University Youth Science Foundation
Keywords
Constrained optimization; Supermemory; Feasible direction method; Global convergence