Funding: supported by the Research Council of Semnan University.
Abstract: By taking a convex combination of the modified secant equations proposed by Yuan and by Wei et al., a hybrid secant equation, and with it a modified BFGS algorithm, is proposed. The hybridization parameter is computed effectively from the information available at recent iterations. Under proper conditions, the proposed algorithm is shown to be globally, locally, and superlinearly convergent. Using the performance profile introduced by Dolan and Moré, the implementation of the proposed algorithm is compared with two efficient modified BFGS algorithms proposed by Yuan and by Wei et al. on a set of unconstrained optimization test problems from the CUTEr collection. Numerical results demonstrating the efficiency of the proposed modified BFGS algorithm are reported.
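As a rough sketch of the construction described above (the exact modified difference vectors of Yuan and of Wei et al. are not reproduced here; y_k^Y and y_k^W below are placeholders for them), a convex-combination hybrid secant equation and the BFGS-type update built on it take the following generic form, with hybridization parameter lambda_k in [0,1]:

```latex
% Sketch only: y_k^{Y} and y_k^{W} stand in for the modified difference
% vectors of Yuan and of Wei et al.; their exact definitions are not shown.
\begin{align*}
  s_k &= x_{k+1} - x_k, \qquad y_k = g_{k+1} - g_k,\\
  \hat{y}_k(\lambda_k) &= \lambda_k\, y_k^{\mathrm{Y}} + (1-\lambda_k)\, y_k^{\mathrm{W}},
      \qquad \lambda_k \in [0,1] && \text{(hybrid secant vector)},\\
  B_{k+1} s_k &= \hat{y}_k(\lambda_k) && \text{(hybrid secant equation)},\\
  B_{k+1} &= B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
      + \frac{\hat{y}_k \hat{y}_k^{\top}}{s_k^{\top} \hat{y}_k}
      && \text{(BFGS-type update built on } \hat{y}_k\text{)}.
\end{align*}
```

The hybridization parameter lambda_k would be chosen at each iteration from recently computed function and gradient values; the specific rule used in the paper is not reproduced here.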
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 10831006 and 10971017).
Abstract: Conjugate gradient methods have played a special role in solving large-scale nonlinear problems. Recently, the author and Dai proposed an efficient nonlinear conjugate gradient method called CGOPT by seeking the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method. In this paper, we use two types of modified secant equations to improve the CGOPT method. Under some assumptions, the improved methods are shown to be globally convergent. Numerical results are also reported.
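For orientation only, the generic shape of a nonlinear conjugate gradient iteration of this kind is sketched below; the specific CGOPT choice of the conjugacy parameter beta_k, and the exact modified secant vectors used to improve it, are not reproduced here.

```latex
% Generic nonlinear CG scheme; \beta_k is method-specific (CGOPT picks the
% direction closest to that of the scaled memoryless BFGS method).
\begin{align*}
  d_0 &= -g_0,\\
  x_{k+1} &= x_k + \alpha_k d_k && \text{($\alpha_k$ from a line search)},\\
  d_{k+1} &= -g_{k+1} + \beta_k d_k, &&\\
  \text{modified variants:}\quad
  y_k &= g_{k+1} - g_k \;\longrightarrow\; \bar{y}_k
      && \text{with } \bar{y}_k \text{ satisfying a modified secant equation } B_{k+1} s_k = \bar{y}_k.
\end{align*}
```

In such variants the gradient difference y_k entering beta_k is replaced by a vector bar{y}_k that incorporates function-value information, which is the general idea behind improving CGOPT with modified secant equations.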
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 11001075 and 11161003), the Post-doctoral Foundation of China (Grant 20090461094), and the Natural Science Foundation of the Henan Province Education Department (Grant 2010B110004).
Abstract: In this paper, a modified limited memory BFGS method for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses the global convergence property without a convexity assumption on the objective function. Under some suitable conditions, the global convergence of the proposed method is proved. Numerical results are reported which illustrate that the proposed method is efficient.
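To make the limited-memory setting concrete, the following is a minimal sketch of the standard L-BFGS two-loop recursion for computing a search direction from the stored curvature pairs. It is a generic baseline in Python with NumPy, not the modified update of the paper, and the modification that removes the convexity assumption is not shown; the curvature condition s^T y > 0 is assumed for every stored pair.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion (generic sketch, not the paper's
    modified update): returns d = -H_k * grad built from the stored pairs
    (s_i, y_i), assuming s_i^T y_i > 0 for each pair."""
    q = grad.copy()
    alphas = []
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]

    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y

    # Initial Hessian approximation H_0 = gamma * I (a common scaling choice).
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q

    # Second loop: oldest pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s

    return -r  # descent direction
```

In a complete method, the returned direction is combined with a line search (typically one satisfying the Wolfe conditions), and the oldest pair is discarded once the memory limit m is exceeded.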