Journal Articles — 5 articles found
An Adaptive Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition
1
Authors: Xiao-Liang Dong, Zhi-Feng Dai, Reza Ghanbari, Xiang-Li Li. Journal of the Operations Research Society of China, EI CSCD, 2021, Issue 2, pp. 411-425 (15 pages)
In this paper, an adaptive three-term conjugate gradient method is proposed for solving unconstrained optimization problems, which generates sufficient descent directions at each iteration. Unlike existing methods, the proposed method dynamically adjusts between the Hestenes-Stiefel and Dai-Liao conjugacy conditions. Under mild conditions, we show that the proposed method converges globally. Numerical experiments indicate that the new method solves the test problems efficiently and is therefore promising.
Keywords: three-term conjugate gradient method; sufficient descent condition; conjugacy condition; global convergence
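The abstract does not give the paper's adaptive rule, but the Dai-Liao conjugacy condition it blends with Hestenes-Stiefel can be illustrated directly. The sketch below is an assumption-labeled illustration of the standard Dai-Liao direction (not the authors' method): with the parameter beta = (g_new·y - t·g_new·s)/(d·y), the new direction satisfies d_new·y = -t·(g_new·s) exactly, for any vectors.

```python
import numpy as np

def dai_liao_direction(g_new, d, y, s, t=1.0):
    """Direction d_new = -g_new + beta*d with the Dai-Liao parameter
    beta = (g_new@y - t*(g_new@s)) / (d@y); by construction it satisfies
    the Dai-Liao conjugacy condition d_new@y = -t*(g_new@s)."""
    beta = (g_new @ y - t * (g_new @ s)) / (d @ y)
    return -g_new + beta * d

# Hypothetical data: y is the gradient change, s a step along d.
rng = np.random.default_rng(0)
g_old, g_new, d = (rng.normal(size=6) for _ in range(3))
y, s = g_new - g_old, 0.5 * d
d_new = dai_liao_direction(g_new, d, y, s, t=1.0)
print(np.isclose(d_new @ y, -(g_new @ s)))  # conjugacy condition holds
```

The identity is exact algebra (the beta cancels the d·y term), so the check passes up to floating-point rounding regardless of the random data.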
A modified three-term conjugate gradient method with sufficient descent property (Cited: 1)
2
Author: Saman Babaie-Kafaki. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2015, Issue 3, pp. 263-272 (10 pages)
A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak, Ribière, and Polyak is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independently of the line search and of the convexity of the objective function. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
Keywords: unconstrained optimization; conjugate gradient method; eigenvalue; sufficient descent condition; global convergence
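The three-term method of Zhang et al. that this paper hybridizes has a well-known closed form whose sufficient descent property holds exactly, independent of the line search. A minimal sketch of that base direction (an illustration of the Zhang-Zhou-Li three-term PRP formula, not of the hybrid proposed in the paper):

```python
import numpy as np

def three_term_prp_direction(g_new, g_old, d_old):
    """Zhang-Zhou-Li three-term PRP direction:
    d = -g_new + beta*d_old - theta*y,  y = g_new - g_old,
    beta = g_new@y/||g_old||^2,  theta = g_new@d_old/||g_old||^2.
    The theta-term cancels the beta-term in g_new@d, giving
    g_new@d = -||g_new||^2 for any line search."""
    y = g_new - g_old
    denom = g_old @ g_old
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y

rng = np.random.default_rng(1)
g_old, g_new, d_old = (rng.normal(size=6) for _ in range(3))
d = three_term_prp_direction(g_new, g_old, d_old)
print(np.isclose(g_new @ d, -(g_new @ g_new)))  # sufficient descent, exactly
```

This is the sense in which "sufficient descent independent of the line search" can hold: it is an algebraic identity of the direction formula, not a property of the step sizes.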
A Modified Hestenes-Stiefel Conjugate Gradient Method and Its Convergence (Cited: 9)
3
Authors: Zeng-Xin Wei, Hai-Dong Huang, Yan-Rong Tao. Journal of Mathematical Research and Exposition, CSCD, 2010, Issue 2, pp. 297-308 (12 pages)
It is well known that the direction generated by the Hestenes-Stiefel (HS) conjugate gradient method may not be a descent direction for the objective function. In this paper, we make a small modification to the HS method so that the generated direction always satisfies the sufficient descent condition. An advantage of the modified Hestenes-Stiefel (MHS) method is that the scalar β_k^HS remains nonnegative under the weak Wolfe-Powell line search. The global convergence of the MHS method is established under some mild conditions. Preliminary numerical results show that the MHS method is slightly more efficient than the PRP and HS methods.
Keywords: conjugate gradient method; sufficient descent condition; line search; global convergence
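The abstract does not state the paper's exact MHS formula, so the sketch below instead shows the most common way a nonnegative HS-type parameter is obtained: truncating the plain Hestenes-Stiefel value at zero (the "HS+" safeguard). This is an assumed stand-in for illustration, not the authors' modification.

```python
import numpy as np

def hs_plus_beta(g_new, g_old, d_old):
    """Hestenes-Stiefel parameter truncated at zero ('HS+'):
    beta = max(g_new@y / (d_old@y), 0) with y = g_new - g_old.
    Truncation guarantees beta >= 0 whatever the line search does."""
    y = g_new - g_old
    return max((g_new @ y) / (d_old @ y), 0.0)

rng = np.random.default_rng(2)
g_old, g_new, d_old = (rng.normal(size=6) for _ in range(3))
print(hs_plus_beta(g_new, g_old, d_old) >= 0.0)  # always nonnegative
```

Nonnegativity of beta matters because it is what lets restart-style global convergence arguments (in the spirit of Gilbert and Nocedal) go through.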
A NEW ADAPTIVE SUBSPACE MINIMIZATION THREE-TERM CONJUGATE GRADIENT ALGORITHM FOR UNCONSTRAINED OPTIMIZATION
4
Authors: Keke Zhang, Hongwei Liu, Zexian Liu. Journal of Computational Mathematics, SCIE CSCD, 2021, Issue 2, pp. 159-177 (19 pages)
A new adaptive subspace minimization three-term conjugate gradient algorithm with nonmonotone line search is introduced and analyzed in this paper. The search directions are computed by minimizing a quadratic approximation of the objective function on special subspaces, and we also propose an adaptive rule for choosing among the search directions at each iteration. Each choice of the search direction satisfies the sufficient descent condition. With the nonmonotone line search, we prove that the new algorithm is globally convergent for general nonlinear functions under some mild assumptions. Numerical experiments show that the proposed algorithm is promising on the given test problem set.
Keywords: conjugate gradient method; nonmonotone line search; subspace minimization; sufficient descent condition; global convergence
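The core step of subspace minimization CG methods is to minimize a quadratic model of the objective over a low-dimensional subspace rather than along a single line. The paper's "special subspaces" and adaptive rule are not given in the abstract; the sketch below assumes the simplest choice, span{-g, d_old}, and an available model Hessian B, reducing the problem to a 2x2 linear system.

```python
import numpy as np

def subspace_min_direction(g, d_old, B):
    """Minimize the quadratic model m(u) = g@u + 0.5*u@B@u over the
    two-dimensional subspace span{-g, d_old}: with basis P = [-g, d_old],
    solve the reduced 2x2 system (P^T B P) a = -P^T g and set u = P a."""
    P = np.column_stack([-g, d_old])
    a = np.linalg.solve(P.T @ B @ P, -(P.T @ g))
    return P @ a

# Hypothetical SPD model Hessian and random gradient/previous direction.
rng = np.random.default_rng(3)
n = 6
M = rng.normal(size=(n, n))
B = M @ M.T + n * np.eye(n)
g, d_old = rng.normal(size=n), rng.normal(size=n)
P = np.column_stack([-g, d_old])
u = subspace_min_direction(g, d_old, B)
print(np.allclose(P.T @ (g + B @ u), 0))  # model gradient ⟂ subspace
print(g @ u < 0)                          # hence a descent direction
```

Descent falls out automatically here: the minimizer beats the best steepest-descent point in the subspace, so its model value is negative, forcing g@u < 0 when B is positive definite.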
TWO FUNDAMENTAL CONVERGENCE THEOREMS FOR NONLINEAR CONJUGATE GRADIENT METHODS AND THEIR APPLICATIONS (Cited: 1)
5
Authors: Ji-Ye Han, Guang-Hui Liu, De-Feng Sun, Hong-Xia Yin. Acta Mathematicae Applicatae Sinica, SCIE CSCD, 2001, Issue 1, pp. 38-46 (9 pages)
Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under the descent condition only. As a result, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for the parameter at its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results hold for the Hestenes-Stiefel algorithm.
Keywords: conjugate gradient method; descent condition; global convergence
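The descent condition g_k@d_k < 0 that these theorems assume is easy to observe empirically. As a minimal sketch (on a convex quadratic with exact line search, a setting where Fletcher-Reeves reduces to linear CG and descent provably holds; the general nonlinear case is exactly what the paper's theorems address):

```python
import numpy as np

def fr_descent_check(A, b, x0, iters=20):
    """Run Fletcher-Reeves CG with exact line search on
    f(x) = 0.5*x@A@x - b@x and report whether the descent
    condition g_k@d_k < 0 held at every iteration."""
    x, g = x0, A @ x0 - b
    d, ok = -g, True
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-10:
            break                            # converged
        ok = ok and (g @ d) < 0              # descent condition
        alpha = -(g @ d) / (d @ A @ d)       # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves parameter
        d = -g_new + beta * d
        g = g_new
    return ok

rng = np.random.default_rng(4)
n = 5
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)                  # SPD: f is strictly convex
b, x0 = rng.normal(size=n), rng.normal(size=n)
print(fr_descent_check(A, b, x0))
```

With exact line search one gets g_{k+1}@d_k = 0, so g_{k+1}@d_{k+1} = -||g_{k+1}||^2 < 0; with inexact line searches this can fail, which is why results that need only the weaker descent condition are useful.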