Journal Articles
4 articles found
1. A New Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems (Cited: 1)
Authors: LIU Jin-kui, WANG Kai-rong, SONG Xiao-qian, DU Xiang-lin. Chinese Quarterly Journal of Mathematics (CSCD), 2010, No. 3, pp. 444-450.
In this paper, an efficient conjugate gradient method is given to solve general unconstrained optimization problems. It guarantees the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable compared with the PRP+ method, so it can be widely used in scientific computation.
Keywords: unconstrained optimization; conjugate gradient method; strong Wolfe line search; sufficient descent property; global convergence
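As a sketch of the kind of iteration this abstract describes, the loop below combines a conjugate gradient direction update with a strong Wolfe line search. The paper's own β_k formula is not reproduced in the abstract, so the classical PRP+ rule (the comparison method the abstract names) stands in; all parameter choices here are illustrative assumptions, not the paper's method.

```python
# Minimal nonlinear CG sketch with a strong Wolfe line search.
# The PRP+ beta is used only as a stand-in for illustration.
import numpy as np
from scipy.optimize import line_search  # finds a step satisfying strong Wolfe

def cg_prp_plus(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # PRP+: beta_k >= 0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function.
from scipy.optimize import rosen, rosen_der
print(cg_prp_plus(rosen, rosen_der, np.array([-1.2, 1.0])))
```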
2. A Descent Gradient Method and Its Global Convergence
Authors: LIU Jin-kui. Chinese Quarterly Journal of Mathematics (CSCD), 2014, No. 1, pp. 142-150.
Y. Liu and C. Storey (1992) proposed the famous LS conjugate gradient method, which has good numerical results. However, the LS method has very weak convergence under the Wolfe-type line search. In this paper, we give a new descent gradient method based on the LS method. It guarantees the sufficient descent property at each iteration and global convergence under the strong Wolfe line search. Finally, we present extensive preliminary numerical experiments that show the efficiency of the proposed method compared with the famous PRP+ method.
Keywords: unconstrained optimization; conjugate gradient method; strong Wolfe line search; sufficient descent property; global convergence
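For context, the Liu-Storey (LS) update this abstract builds on is, in its classical form (a standard statement from the literature, not the paper's modified descent direction):

```latex
% Liu-Storey (LS) conjugate gradient update, classical form:
d_k = -g_k + \beta_k^{LS}\, d_{k-1}, \qquad
\beta_k^{LS} = \frac{g_k^{\top}(g_k - g_{k-1})}{-\,d_{k-1}^{\top} g_{k-1}}
```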
3. CONVERGENCE ANALYSIS ON A CLASS OF CONJUGATE GRADIENT METHODS WITHOUT SUFFICIENT DECREASE CONDITION (Cited: 1)
Authors: 刘光辉, 韩继业, 戚厚铎, 徐中玲. Acta Mathematica Scientia (SCIE, CSCD), 1998, No. 1, pp. 11-16.
Recently, Gilbert and Nocedal [3] investigated the global convergence of conjugate gradient methods related to the Polak-Ribière formula; they restricted β_k to non-negative values. [5] discussed the same problem as [3] and relaxed β_k to be negative when the objective function is convex. This paper allows β_k to be selected in a wider range than [5]. In particular, the global convergence of the corresponding algorithm without the sufficient decrease condition is proved.
Keywords: Polak-Ribière; conjugate gradient method; strong Wolfe line search; global convergence
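For context, the Polak-Ribière formula and the non-negativity restriction of Gilbert and Nocedal [3] that the abstract discusses are, in standard notation (not quoted from the paper itself):

```latex
% Polak-Ribiere beta and the Gilbert-Nocedal non-negative truncation [3]:
\beta_k^{PR} = \frac{g_k^{\top}(g_k - g_{k-1})}{\lVert g_{k-1} \rVert^{2}}, \qquad
\beta_k^{+} = \max\{\beta_k^{PR},\, 0\}
```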
4. Derivation and Global Convergence for Memoryless Non-quasi-Newton Method
Authors: JIAO Bao Cong, YU Jing Jing, CHEN Lan Ping. Journal of Mathematical Research and Exposition (CSCD), 2009, No. 3, pp. 423-433.
In this paper, a new class of memoryless non-quasi-Newton methods for solving unconstrained optimization problems is proposed, and the global convergence of this method with inexact line search is proved. Furthermore, we propose a hybrid method that mixes the memoryless non-quasi-Newton method and the memoryless Perry-Shanno quasi-Newton method. The global convergence of this hybrid memoryless method is proved under mild assumptions. Initial results show that these new methods are efficient on the given test problems. In particular, the memoryless non-quasi-Newton method requires little storage and computation, so it can efficiently solve large-scale optimization problems.
Keywords: memoryless non-quasi-Newton method; Wolfe line search; global convergence
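To illustrate what "memoryless" means here: a memoryless method rebuilds its inverse-Hessian approximation from the identity and only the most recent step pair, so no matrix is ever stored. The sketch below computes the standard memoryless BFGS direction as a generic illustration; it is not the paper's non-quasi-Newton or Perry-Shanno update, which the abstract does not reproduce.

```python
# Memoryless BFGS direction: the inverse-Hessian approximation is the BFGS
# update of the identity using only the latest pair (s, y), applied to the
# gradient without forming any matrix. Generic illustration, not the
# paper's update.
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Return d = -H g with H = BFGS update of I using (s, y).

    g : current gradient; s = x_k - x_{k-1}; y = g_k - g_{k-1}.
    """
    rho = 1.0 / (y @ s)                      # requires curvature y^T s > 0
    Hg = (g
          - rho * (s @ g) * y
          - rho * (y @ g) * s
          + rho * (s @ g) * (1.0 + rho * (y @ y)) * s)
    return -Hg
```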