Abstract
In this paper, a Newton-like method for solving minimax optimization problems is proposed. The method belongs to the class of sequential quadratic programming (SQP) methods; the Hessian of the quadratic programming subproblem is a convex combination of the Hessians of the objective functions. When the Hessian of the quadratic programming subproblem is not positive definite, a strategy that forces the matrix to be positive definite is applied, so that the quadratic programming subproblem admits a good numerical solution. The paper proves that the algorithm has global convergence and q-superlinear convergence properties. Preliminary numerical experiments are also reported to show that the new algorithm performs well.
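The abstract's strategy of forcing the QP subproblem's Hessian to be positive definite can be illustrated with a simple sketch. The code below is an assumption-laden illustration, not the paper's exact rule: it builds a convex combination of objective Hessians with hypothetical weights and, if the result is indefinite, shifts it by a multiple of the identity until its smallest eigenvalue clears a threshold.

```python
import numpy as np

def force_positive_definite(H, eps=1e-8):
    """Shift H by a multiple of the identity so that its smallest
    eigenvalue is at least eps. This is one common "forcing" strategy;
    the paper's specific modification is not given in the abstract."""
    lam_min = np.linalg.eigvalsh(H).min()
    if lam_min < eps:
        H = H + (eps - lam_min) * np.eye(H.shape[0])
    return H

# Hypothetical convex-combination weights (e.g., QP multipliers), summing to 1.
weights = np.array([0.7, 0.3])
# Hypothetical Hessians of two objective functions; the first is indefinite.
hessians = [np.array([[2.0, 0.0], [0.0, -1.0]]),
            np.array([[1.0, 0.5], [0.5, 1.0]])]

# Convex combination of the objective Hessians.
H = sum(w * Hi for w, Hi in zip(weights, hessians))

# H is indefinite here, so the forcing step shifts it to positive definite.
H_pd = force_positive_definite(H)
```

With these weights the combined matrix has a negative eigenvalue, so the shift is triggered and the returned matrix is positive definite, which is what guarantees the QP subproblem is well behaved numerically.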
Source
《数值计算与计算机应用》
CSCD
Peking University Core Journal
2004, No. 2, pp. 108-115 (8 pages)
Journal on Numerical Methods and Computer Applications
Funding
National Natural Science Foundation of China (19971008)
Beijing Municipal Education Commission Foundation