Journal Articles
3 articles found
1. Feasible SQP Descent Method for Inequality Constrained Optimization Problems and Its Convergence (Cited: 1)
Authors: 张和平, 叶留青. Chinese Quarterly Journal of Mathematics (CSCD), 2009, No. 3, pp. 469-474 (6 pages)
In this paper, a new SQP feasible descent algorithm for nonlinear constrained optimization problems is presented. Under relatively weaker conditions, we prove that the new method still possesses global convergence as well as strong convergence. The numerical results illustrate that the new method is valid.
Keywords: nonlinearly constrained optimization; SQP; generalized projection; line search; global convergence; strong convergence
2. A Fastly Convergent Directly Feasible Directions Method for Nonlinearly Inequality Constrained Optimization
Author: JIAN Jinbao (Department of Mathematics and Information Science, Guangxi University, Nanning 530004, China). Systems Science and Systems Engineering (CSCD), 1997, No. 1, pp. 5-14 (10 pages)
In this paper, a new superlinearly convergent algorithm for nonlinearly constrained optimization problems is presented. The search directions are computed directly by a few formulas, and neither quadratic programs nor linear equations need to be solved. Under mild assumptions, the new algorithm is shown to possess global and superlinear convergence.
Keywords: nonlinearly constrained optimization; direct search directions; feasible directions method; global and superlinear convergence
3. Analysis on a Superlinearly Convergent Augmented Lagrangian Method (Cited: 2)
Author: Ya Xiang YUAN. Acta Mathematica Sinica, English Series (SCIE, CSCD), 2014, No. 1, pp. 1-10 (10 pages)
The augmented Lagrangian method is a classical method for solving constrained optimization. Recently, it has attracted much attention due to its applications to sparse optimization in compressive sensing and low-rank matrix optimization problems. However, most Lagrangian methods use first-order information to update the Lagrange multipliers, which leads to only linear convergence. In this paper, we study an update technique based on second-order information and prove that superlinear convergence can be obtained. Theoretical properties of the update formula are given, and some implementation issues regarding the new update are also discussed.
Keywords: nonlinearly constrained optimization; augmented Lagrange function; Lagrange multiplier; convergence
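The abstract above contrasts the classical first-order multiplier update, which converges only linearly, with a second-order update. As a point of reference, here is a minimal sketch of the classical scheme it refers to: an augmented Lagrangian loop with the first-order update λ ← λ + ρ·c(x). The toy problem (minimize x₁² + x₂² subject to x₁ + x₂ = 1), the penalty value ρ, and the iteration count are all hypothetical choices for illustration, not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy problem for illustration:
# minimize f(x) = x1^2 + x2^2   subject to   c(x) = x1 + x2 - 1 = 0.
# The exact solution is x* = (0.5, 0.5) with multiplier lam* = -1.
f = lambda x: x[0] ** 2 + x[1] ** 2
c = lambda x: x[0] + x[1] - 1.0

def augmented_lagrangian(f, c, x0, lam=0.0, rho=10.0, iters=20):
    """Classical augmented Lagrangian loop. The multiplier update
    lam += rho * c(x) uses only first-order information (the constraint
    value), which is the linearly convergent scheme the abstract
    contrasts with a second-order update."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Inner unconstrained minimization of the augmented Lagrangian
        # L(x) = f(x) + lam * c(x) + (rho / 2) * c(x)^2.
        L = lambda x: f(x) + lam * c(x) + 0.5 * rho * c(x) ** 2
        x = minimize(L, x).x
        # First-order multiplier update.
        lam += rho * c(x)
    return x, lam

x, lam = augmented_lagrangian(f, c, x0=[0.0, 0.0])
print(x, lam)  # x approaches (0.5, 0.5), lam approaches -1
```

For this quadratic example the multiplier error contracts by a constant factor per outer iteration, which is exactly the linear convergence mentioned in the abstract; the paper's contribution is an update using second-order information that achieves superlinear convergence instead.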