In this paper, we propose an extended Levenberg-Marquardt (ELM) framework that generalizes the classic Levenberg-Marquardt (LM) method to solve the unconstrained minimization problem min ρ(r(x)), where r : R^n → R^m and ρ : R^m → R. We also develop a few inexact variants that generalize ELM to the cases where the inner subproblem is not solved exactly and the Jacobian is simplified or perturbed. Global convergence and local superlinear convergence are established under suitable conditions. Numerical results show that our methods are promising.
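The classic LM iteration that the ELM framework generalizes can be sketched as follows. This is a minimal sketch for the special case ρ(s) = ½‖s‖², where LM solves the damped normal equations (JᵀJ + λI)d = -Jᵀr at each step; the residual function and the doubling/halving damping update are illustrative textbook choices, not the paper's method.

```python
import numpy as np

def levenberg_marquardt(r, J, x0, lam=1e-3, tol=1e-8, max_iter=100):
    """Classic LM for min 0.5*||r(x)||^2 (the rho(s) = 0.5*||s||^2 case).

    r : residual function R^n -> R^m
    J : Jacobian of r, returning an m-by-n matrix
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        rx, Jx = r(x), J(x)
        g = Jx.T @ rx                      # gradient of 0.5*||r||^2
        if np.linalg.norm(g) < tol:
            break
        # damped normal equations: (J^T J + lam*I) d = -J^T r
        d = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), -g)
        if 0.5 * r(x + d) @ r(x + d) < 0.5 * rx @ rx:
            x, lam = x + d, lam * 0.5      # accept step, relax damping
        else:
            lam *= 2.0                     # reject step, increase damping
    return x

# Illustrative linear residual with minimizer (1, -2)
r = lambda x: np.array([x[0] - 1.0, x[1] + 2.0])
J = lambda x: np.eye(2)
sol = levenberg_marquardt(r, J, np.zeros(2))
```

The inexact variants described in the abstract would replace the exact solve of the damped system (and the exact Jacobian `J`) with approximations.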
Based on a class of functions that generalize the squared Fischer-Burmeister NCP function and share its many desirable properties, we reformulate the nonlinear complementarity problem (NCP for short) as an equivalent unconstrained optimization problem, for which we propose a derivative-free descent method in the monotone case. We show its global convergence under some mild conditions. If F, the function involved in the NCP, is an R₀-function, the optimization problem has bounded level sets. A local property of the merit function is discussed. Finally, we report some numerical results.
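The reformulation rests on the squared Fischer-Burmeister function φ(a, b) = (√(a² + b²) − a − b)², which vanishes exactly when a ≥ 0, b ≥ 0, and ab = 0. Summing it over the components gives a merit function whose global minima are the NCP solutions. A minimal sketch, with an illustrative one-dimensional F (not from the paper):

```python
import numpy as np

def fb_squared(a, b):
    # squared Fischer-Burmeister: phi(a, b) = (sqrt(a^2 + b^2) - a - b)^2
    # phi(a, b) = 0  iff  a >= 0, b >= 0, a*b = 0
    return (np.sqrt(a * a + b * b) - a - b) ** 2

def merit(x, F):
    # Psi(x) = 0.5 * sum_i phi(x_i, F_i(x));
    # Psi(x) = 0 exactly when x solves NCP(F)
    return 0.5 * np.sum(fb_squared(x, F(x)))

# NCP(F): x >= 0, F(x) >= 0, x^T F(x) = 0, with the toy map
# F(x) = x - 1, whose unique NCP solution is x = 1
F = lambda x: x - 1.0
x_sol = np.array([1.0])
```

The paper's generalized class replaces φ by functions with the same zero set and similar smoothness, so the NCP can still be solved by driving this merit function to zero without derivatives of F.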
In this paper, a conjugate-gradient method that is invariant under nonlinear scaling with respect to a conic function is proposed. This method finds the minimizer of a larger class of functions in a finite number of iterations; this class is more general than the class of functions handled by the conjugate-gradient method for conic functions (CCG for short) [1]. In fact, the proposed method is an extension of the CCG method. Numerical results show that the new method is effective.
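For context, a standard nonlinear conjugate-gradient iteration (Fletcher-Reeves with an Armijo backtracking line search) is sketched below as a baseline; the conic-invariant variant proposed in the paper modifies the search direction through the nonlinear scaling, and its details are in the paper, not here. The restart safeguard is a common practical choice, added here only to guarantee descent.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Fletcher-Reeves nonlinear CG with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking line search (Armijo sufficient-decrease condition)
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # safeguard: restart with steepest
            d = -g_new                     # descent if direction degenerates
        g = g_new
    return x

# Quadratic test: f(x) = 0.5 x^T A x - b^T x, minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
sol = nonlinear_cg(f, grad, np.zeros(2))
```

On a quadratic, CG with exact line searches terminates in at most n steps; the finite-termination property claimed in the abstract extends this behavior to the broader, conically scaled class of functions.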
Funding: the National Natural Science Foundation of China.