Abstract: In this paper, we propose and analyze a new scaled conjugate gradient method, based on the modified secant equation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method and on a new modified nonmonotone line search technique. The method incorporates the modified BFGS secant equation in an effort to include second-order information about the objective function. The new secant equation uses both gradient and function-value information, and its update formula inherits the positive definiteness of the Hessian approximation for general convex functions. To improve the likelihood of finding a globally optimal solution, we introduce a new modified nonmonotone line search technique. It is shown that, for nonsmooth convex problems, the proposed algorithm is globally convergent. Numerical results show that the new scaled conjugate gradient algorithm is promising and efficient for solving not only convex but also some large-scale nonsmooth nonconvex problems, in the sense of the Dolan-Moré performance profiles.
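The abstract does not display the modified secant equation itself. As a hedged illustration, one widely used modified secant equation from the BFGS literature that carries both gradient and function-value information (in the style of Zhang-Deng-Chen and Wei-Li-Qi; the paper's own formula may differ in detail) is:

```latex
B_{k+1} s_k = \tilde{y}_k,
\qquad
\tilde{y}_k = y_k + \frac{\theta_k}{s_k^{\top} s_k}\, s_k,
\qquad
\theta_k = 2\left(f_k - f_{k+1}\right) + \left(g_{k+1} + g_k\right)^{\top} s_k,
```

where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, f_k = f(x_k) and g_k = \nabla f(x_k). For a quadratic objective, θ_k vanishes, so the modification acts as a higher-order correction that exploits function values in addition to gradients.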
Funding: the National Natural Science Foundation of China (10471062) and the Natural Science Foundation of Jiangsu Province (BK2006184).
Abstract: A new limited memory symmetric rank one algorithm is proposed. It combines a modified self-scaled symmetric rank one (SSR1) update with limited memory and nonmonotone line search techniques. In this algorithm, the descent search direction is generated by an inverse limited memory SSR1 update, which simplifies the computation. A numerical comparison of the algorithm with the well-known limited memory BFGS algorithm is given; the results indicate that the new algorithm can handle a class of large-scale unconstrained optimization problems.
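As a hedged sketch of the flavor of such an update (the paper's limited-memory implementation is certainly more elaborate), the following Python fragment builds a search direction from a memoryless, self-scaled SR1 inverse update; the scaling factor gamma and the safeguard threshold r are illustrative choices, not taken from the paper:

```python
import numpy as np

def ssr1_direction(g, s, y, r=1e-8):
    """Search direction from a memoryless self-scaled SR1 inverse update.

    Illustrative sketch only: the inverse Hessian approximation H is
    reset to gamma*I (self-scaling) and corrected once by the SR1
    formula, so only the most recent pair (s, y) is stored, which
    captures the flavor of a limited-memory scheme.
    """
    # Oren-Luenberger-type scaling; assumes the curvature s'y is positive.
    gamma = max((s @ y) / (y @ y), 1e-12)
    H = gamma * np.eye(g.size)            # scaled initial inverse Hessian
    v = s - H @ y                         # SR1 correction vector
    # Standard SR1 safeguard: skip the update if the denominator is tiny.
    if abs(v @ y) >= r * np.linalg.norm(v) * np.linalg.norm(y):
        H = H + np.outer(v, v) / (v @ y)
    # Quasi-Newton direction; note plain SR1 does not guarantee descent,
    # which is one issue the paper's modified SSR1 update addresses.
    return -H @ g
```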
Funding: supported by NSFC Grants 10001031 and 70472074, and by NSERC Grant 283103.
Abstract: The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, so as to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the resulting nonmonotone self-scaling BFGS algorithms. We prove that, under conditions weaker than those in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
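A minimal sketch of this combination, assuming the classical Oren-Luenberger scaling factor and the Grippo-Lampariello-Lucidi max-type rule as one common form of nonmonotone line search (the two specific forms analyzed in the paper may differ):

```python
import numpy as np
from collections import deque

def nonmonotone_armijo(f, x, d, g, fmem, delta=1e-4, beta=0.5):
    # Max-type nonmonotone Armijo rule: accept alpha when
    # f(x + alpha*d) <= max(recent f-values) + delta * alpha * g'd.
    fmax = max(fmem)
    alpha = 1.0
    for _ in range(60):                   # cap backtracking for safety
        if f(x + alpha * d) <= fmax + delta * alpha * (g @ d):
            break
        alpha *= beta
    return alpha

def self_scaling_bfgs(f, grad, x0, M=8, tol=1e-6, maxit=500):
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                    # Hessian approximation
    fmem = deque([f(x)], maxlen=M)        # memory for the nonmonotone rule
    for _ in range(maxit):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -np.linalg.solve(B, g)        # quasi-Newton direction
        alpha = nonmonotone_armijo(f, x, d, g, fmem)
        s = alpha * d
        x_new = x + s
        y = grad(x_new) - g
        if s @ y > 1e-12:                 # keep B positive definite
            tau = (s @ y) / (s @ B @ s)   # Oren-Luenberger scaling factor
            Bs = B @ s
            B = tau * (B - np.outer(Bs, Bs) / (s @ Bs)) + np.outer(y, y) / (s @ y)
        x = x_new
        fmem.append(f(x))
    return x
```

The memory length M controls how nonmonotone the iteration may be: M = 1 recovers the ordinary monotone Armijo rule, while larger M allows occasional increases in f that help the method traverse nonconvex regions.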
Funding: This research is partly supported by the National Outstanding Young Investigator Grant (70225005) and Project (70471088) of the National Natural Science Foundation of China.
Abstract: This paper presents a unified framework of nonmonotone convex combination algorithms (such as the Frank-Wolfe algorithm) for solving traffic assignment problems. Global convergence results are established under mild conditions. The line search procedure used in our algorithm includes the nonmonotone Armijo rule, the nonmonotone Goldstein rule and the nonmonotone Wolfe rule as special cases, so the new algorithm can be viewed as a generalization of the regular convex combination algorithm.
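To illustrate the convex combination scheme, here is a minimal, hypothetical Frank-Wolfe iteration with a max-type nonmonotone Armijo rule over the standard simplex, used as a stand-in for a path-flow feasible region in traffic assignment; the toy quadratic objective and all parameter values are assumptions for illustration only:

```python
import numpy as np

def frank_wolfe_nonmonotone(f, grad, x0, M=5, delta=1e-4, beta=0.5,
                            tol=1e-8, maxit=200):
    """Frank-Wolfe (convex combination) method with a nonmonotone
    Armijo rule over the standard simplex. Illustrative sketch."""
    x = np.asarray(x0, dtype=float)
    hist = [f(x)]                     # recent values for the nonmonotone test
    for _ in range(maxit):
        g = grad(x)
        # Linear subproblem over the simplex: the minimizer is a vertex.
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0
        d = v - x                     # convex combination direction
        gap = -(g @ d)                # Frank-Wolfe duality gap
        if gap < tol:
            break
        fmax = max(hist[-M:])         # worst of the M most recent values
        alpha = 1.0
        for _ in range(60):           # backtracking, capped for safety
            if f(x + alpha * d) <= fmax + delta * alpha * (g @ d):
                break
            alpha *= beta
        x = x + alpha * d             # stays feasible: convex combination
        hist.append(f(x))
    return x

# Toy usage: minimize a strictly convex quadratic over the 3-simplex,
# a stand-in for splitting one unit of travel demand among three paths.
Q = np.diag([1.0, 2.0, 4.0])
b = np.array([1.0, 0.5, 0.2])
x_star = frank_wolfe_nonmonotone(lambda x: 0.5 * x @ Q @ x - b @ x,
                                 lambda x: Q @ x - b,
                                 np.ones(3) / 3)
```

Because each iterate is a convex combination of the previous iterate and an extreme point of the feasible set, feasibility is preserved without any projection, which is why convex combination methods are popular for traffic assignment.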