Funding: This work was supported by the First-Class Disciplines Foundation of Ningxia Hui Autonomous Region (No. NXYLXK2017B09), the National Natural Science Foundation of China (Nos. 11601012, 11861002, 71771030), the Key Project of North Minzu University (No. ZDZX201804), the Natural Science Foundation of Ningxia Hui Autonomous Region (Nos. NZ17103, 2018AAC03253), the Natural Science Foundation of Guangxi Zhuang Autonomous Region (No. 2018GXNSFAA138169), and the Guangxi Key Laboratory of Cryptography and Information Security (No. GCIS201708).
Abstract: In this paper, an adaptive three-term conjugate gradient method is proposed for solving unconstrained optimization problems; it generates sufficient descent directions at each iteration. Unlike existing methods, the proposed method dynamically adjusts between the Hestenes–Stiefel and Dai–Liao conjugacy conditions. Under mild conditions, we show that the proposed method converges globally. Numerical experiments indicate that the new method solves the test problems efficiently and is therefore promising.
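The abstract does not reproduce the paper's adaptive adjustment rule, so the following is only a minimal sketch of a three-term direction with a Dai–Liao-type parameter t, which reduces to the Hestenes–Stiefel choice when t = 0; the function name `three_term_direction` and the fixed value of t are illustrative assumptions, not the authors' method.

```python
import numpy as np

def three_term_direction(g_new, g_old, d_old, t=0.1, eps=1e-12):
    """Illustrative three-term CG direction with a Dai-Liao-type parameter.

    For t = 0 the beta below is the Hestenes-Stiefel choice, so varying t
    can be read as moving between the HS and Dai-Liao conjugacy conditions.
    The paper's adaptive rule for t is not reproduced here; t is a fixed
    illustrative constant. d_old is used in place of the step
    s_k = alpha_k * d_k, from which it differs only by a positive scaling.
    """
    y = g_new - g_old                  # gradient difference y_k
    dy = d_old @ y
    if abs(dy) < eps:                  # safeguard: restart with steepest descent
        return -g_new
    beta = (g_new @ y - t * (g_new @ d_old)) / dy   # Dai-Liao beta (HS at t = 0)
    theta = (g_new @ d_old) / dy       # third-term weight enforcing descent
    return -g_new + beta * d_old - theta * y
```

For t = 0 a direct calculation gives g_new^T d = -||g_new||^2, i.e. the sufficient descent condition holds exactly, independent of the line search.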
Funding: Supported by the Research Council of Semnan University.
Abstract: A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak and Ribière, and by Polyak, is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independently of the line search and of the convexity of the objective function. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
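The hybrid coefficients of the proposed method are not stated in the abstract; as background, here is a sketch of the three-term PRP direction of Zhang et al. that the hybridization builds on, with the helper name `three_term_prp_direction` being an assumption of this note.

```python
import numpy as np

def three_term_prp_direction(g_new, g_old, d_old, eps=1e-12):
    """Three-term PRP direction of Zhang et al. (illustrative sketch).

    d_{k+1} = -g_{k+1} + beta_PRP * d_k - theta * y_k with
    theta = g_{k+1}^T d_k / ||g_k||^2, which yields
    g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 exactly, i.e. sufficient descent
    independent of the line search. The hybridization discussed in the
    abstract above is not reproduced here.
    """
    gg = g_old @ g_old
    if gg < eps:                       # degenerate case: steepest descent
        return -g_new
    y = g_new - g_old
    beta_prp = (g_new @ y) / gg        # Polak-Ribiere-Polyak parameter
    theta = (g_new @ d_old) / gg       # third-term weight
    return -g_new + beta_prp * d_old - theta * y
```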
Funding: Supported by the National Natural Science Foundation of China (Grant No. 10761001).
Abstract: It is well known that the direction generated by the Hestenes-Stiefel (HS) conjugate gradient method may fail to be a descent direction for the objective function. In this paper, we make a slight modification to the HS method so that the generated direction always satisfies the sufficient descent condition. An advantage of the modified Hestenes-Stiefel (MHS) method is that the scalar β_k^{HS} remains nonnegative under the weak Wolfe-Powell line search. The global convergence of the MHS method is established under mild conditions. Preliminary numerical results show that the MHS method is slightly more efficient than the PRP and HS methods.
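The exact MHS modification is not given in the abstract; the sketch below only shows the common pattern it alludes to, truncating the HS scalar at zero and restarting with steepest descent when descent is lost. The name `mhs_like_direction` and the safeguards are assumptions, not the authors' formula.

```python
import numpy as np

def mhs_like_direction(g_new, g_old, d_old, eps=1e-12):
    """Generic HS direction with a nonnegativity safeguard (illustrative).

    Not the MHS method itself: just the standard device of keeping
    beta_HS nonnegative and falling back to steepest descent whenever
    the resulting direction is not a descent direction.
    """
    y = g_new - g_old
    dy = d_old @ y
    if abs(dy) < eps:
        return -g_new                      # restart: steepest descent
    beta = max((g_new @ y) / dy, 0.0)      # keep the HS scalar nonnegative
    d = -g_new + beta * d_old
    if g_new @ d >= 0:                     # enforce a descent direction
        d = -g_new
    return d
```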
Funding: The authors thank the editor and the anonymous referees for their valuable comments, which helped improve the quality of this paper. We would like to thank Professor Dai Y. H. and Dr. Kou C. X. for their CGOPT code, and Professors Hager W. W. and Zhang H. C. for their CG_DESCENT (5.3) code. This research is supported by the National Natural Science Foundation of China (No. 11901561), the China Postdoctoral Science Foundation (No. 2019M660833), and the Guangxi Natural Science Foundation (No. 2018GXNSFBA281180).
Abstract: A new adaptive subspace minimization three-term conjugate gradient algorithm with nonmonotone line search is introduced and analyzed in this paper. The search directions are computed by minimizing a quadratic approximation of the objective function on special subspaces, and we also propose an adaptive rule for choosing among the different search directions at each iteration. A significant conclusion is that each choice of search direction satisfies the sufficient descent condition. Under the nonmonotone line search employed, we prove that the new algorithm is globally convergent for general nonlinear functions under mild assumptions. Numerical experiments show that the proposed algorithm is promising on the given set of test problems.
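The special subspaces and the adaptive rule are not specified in the abstract; the sketch below illustrates the generic subspace-minimization step over span{g_k, s_{k-1}} only. The curvature estimate for g^T B g is an assumption of this sketch (a Barzilai-Borwein-type scaling), not the paper's choice, and `subspace_min_direction` is a hypothetical name.

```python
import numpy as np

def subspace_min_direction(g, s, y, eps=1e-12):
    """Minimize a quadratic model over span{g_k, s_{k-1}} (illustrative).

    Model: m(d) = g^T d + 0.5 d^T B d with d = u*g + v*s. The secant
    relation B s = y supplies s^T B s = s^T y and g^T B s = g^T y; the
    remaining curvature g^T B g is approximated by a simple scaling
    (an assumption, not the paper's rule).
    """
    sy = s @ y
    if sy < eps:                             # no usable curvature: steepest descent
        return -g
    gBg = (sy / (s @ s)) * (g @ g)           # assumed estimate of g^T B g
    A = np.array([[gBg,   g @ y],
                  [g @ y, sy   ]])
    b = -np.array([g @ g, g @ s])
    try:
        u, v = np.linalg.solve(A, b)         # stationary point of the 2-D model
    except np.linalg.LinAlgError:
        return -g
    d = u * g + v * s
    return d if g @ d < 0 else -g            # fall back if descent is lost
```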
Funding: This project is supported by the National Natural Science Foundation of China (No. 19731001).
Abstract: Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under the descent condition alone. As a consequence, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for a parameter at its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results also hold for the Hestenes-Stiefel algorithm.
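For reference, these are the standard conditions such convergence analyses rest on; the notation (g_k for the gradient, d_k for the search direction) is assumed here, not taken from the abstract.

```latex
% g_k = \nabla f(x_k), d_k = search direction at iteration k.
\begin{align*}
  &\text{Descent condition:} \quad g_k^{\top} d_k < 0, \\
  &\text{Sufficient descent:} \quad g_k^{\top} d_k \le -c\,\|g_k\|^2,
     \quad c > 0, \\
  &\text{Zoutendijk condition:} \quad
     \sum_{k \ge 0} \frac{(g_k^{\top} d_k)^2}{\|d_k\|^2} < \infty.
\end{align*}
% Combined with a suitable bound on the conjugate parameter (e.g. the
% Fletcher--Reeves choice \beta_k^{FR} = \|g_k\|^2 / \|g_{k-1}\|^2),
% these yield \liminf_{k \to \infty} \|g_k\| = 0.
```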