Abstract: In this paper, a nonmonotonic trust region method for optimization problems with equality constraints is proposed by introducing a nonsmooth merit function and adopting a correction step. It is proved that all accumulation points of the iterates generated by the proposed algorithm are Kuhn-Tucker points and that the algorithm is q-superlinearly convergent.
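The abstract does not spell out the merit function, the correction step, or the nonmonotone acceptance rule, so the Python sketch below only illustrates the generic mechanics of a nonmonotone trust-region iteration: a Cauchy-point trial step, an acceptance test that compares the trial merit value against the worst of the last M accepted values rather than only the current one, and the usual radius update. Every function name, parameter value, and the smooth quadratic-penalty toy at the end are illustrative assumptions, not the method of the paper.

```python
# Hedged sketch of a generic nonmonotone trust-region loop; not the paper's method.
import numpy as np

def nonmonotone_trust_region(phi, grad, hess, x0, delta=1.0, M=5,
                             eta=1e-4, max_iter=200, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    history = [phi(x)]                      # recent merit values for the nonmonotone test
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        # Cauchy point: minimize the quadratic model along -g inside the trust region.
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(1.0, np.linalg.norm(g) ** 3 / (delta * gBg))
        step = -tau * (delta / np.linalg.norm(g)) * g
        pred = -(g @ step + 0.5 * step @ B @ step)      # model decrease (nonnegative)
        phi_ref = max(history[-M:])                     # worst of the last M merit values
        rho = (phi_ref - phi(x + step)) / max(pred, 1e-16)
        if rho >= eta:                                  # accept, cautiously enlarge the radius
            x = x + step
            history.append(phi(x))
            delta = min(2.0 * delta, 1e3)
        else:                                           # reject, shrink the radius
            delta *= 0.5
    return x

# Toy usage on min x1^2 + x2^2 s.t. x1 + x2 = 1, handled here with a *smooth*
# quadratic-penalty merit just so the sketch runs end to end (the paper's merit is nonsmooth).
sigma = 10.0
phi  = lambda x: x @ x + sigma * (x[0] + x[1] - 1.0) ** 2
grad = lambda x: 2.0 * x + 2.0 * sigma * (x[0] + x[1] - 1.0) * np.ones(2)
hess = lambda x: 2.0 * np.eye(2) + 2.0 * sigma * np.ones((2, 2))
print(nonmonotone_trust_region(phi, grad, hess, np.array([2.0, -1.0])))
# should approach the penalized minimizer near (0.476, 0.476)
```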
Abstract: In this paper, we present a regularized Newton method with correction (M-RNM) for minimizing a convex function whose Hessian matrices may be singular. At every iteration, not only an RNM step but also two correction steps are computed. We show that if the objective function is LC^2, then the method is globally convergent. Numerical results show that the new algorithm performs very well.
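The two correction steps of M-RNM are not described in the abstract, so the following sketch only shows a basic regularized Newton step (H + mu I) d = -g with the common choice mu = ||g||; the regularization rule, the function names, and the toy problem below are assumptions rather than the paper's algorithm.

```python
# Hedged sketch: one regularized Newton step for a convex objective whose Hessian
# may be singular.  M-RNM also takes two correction steps per iteration, which are
# not specified in the abstract and are omitted here; the rule mu = ||g|| is a
# common assumption, not necessarily the paper's choice.
import numpy as np

def regularized_newton_step(grad, hess, x):
    g = grad(x)
    mu = np.linalg.norm(g)                              # regularization weight (assumed rule)
    d = np.linalg.solve(hess(x) + mu * np.eye(len(x)), -g)
    return x + d

# Toy usage on f(x) = 0.25 * ||x||^4, whose Hessian is singular at the minimizer x = 0.
f_grad = lambda x: np.dot(x, x) * x
f_hess = lambda x: np.dot(x, x) * np.eye(len(x)) + 2.0 * np.outer(x, x)
x = np.array([1.0, -2.0])
for _ in range(50):
    x = regularized_newton_step(f_grad, f_hess, x)
print(x)  # should approach the minimizer at the origin
```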
Funding: Supported by the Hong Kong Research Grant Council (Grant No. Poly U5001/12p) and the National Natural Science Foundation of China (Grant No. 11101231).
Abstract: We propose a smoothing trust region filter algorithm for nonsmooth nonconvex least squares problems. We present convergence theorems of the proposed algorithm to a Clarke stationary point or a global minimizer of the objective function under certain conditions. Preliminary numerical experiments show the efficiency of the proposed algorithm for finding zeros of systems of high-degree polynomial equations on the sphere and for solving differential variational inequalities.
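Neither the paper's smoothing function nor its filter mechanism is described in the abstract; the sketch below only illustrates the general smoothing idea, replacing |t| with the standard approximation sqrt(t^2 + mu^2) and solving a sequence of smooth least squares subproblems with SciPy's off-the-shelf trust-region solver while mu is driven to zero. The residual, the smoothing schedule, and all parameters are illustrative assumptions.

```python
# Hedged sketch of the smoothing idea only; not the paper's filter algorithm.
import numpy as np
from scipy.optimize import least_squares

def smoothed_abs(t, mu):
    """Smooth approximation of |t| that converges to |t| as mu -> 0."""
    return np.sqrt(t * t + mu * mu)

def smoothing_least_squares(smoothed_residual, x0, mu0=0.5, shrink=0.1, outer_iters=6):
    """Solve a sequence of smoothed least squares subproblems while mu -> 0."""
    x, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(outer_iters):
        sol = least_squares(lambda z: smoothed_residual(z, mu), x)  # smooth trust-region subproblem
        x, mu = sol.x, shrink * mu          # warm-start the next, tighter smoothing level
    return x

# Toy usage: nonsmooth residuals r_i(x) = |x_i| - 1, smoothed componentwise.
r_mu = lambda x, mu: smoothed_abs(x, mu) - 1.0
print(smoothing_least_squares(r_mu, np.array([0.3, -2.0])))  # should be roughly (1, -1)
```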