Journal Articles
79 articles found
1. A modified three-term conjugate gradient method with sufficient descent property (cited: 1)
Author: Saman Babaie-Kafaki. Applied Mathematics (A Journal of Chinese Universities), SCIE, CSCD, 2015, Issue 3, pp. 263-272.
Abstract: A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak and Ribière and by Polyak is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independently of the line search and of the convexity of the objective function. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
Keywords: unconstrained optimization; conjugate gradient method; eigenvalue; sufficient descent condition; global convergence
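Several abstracts in this list turn on the same ingredients: a conjugate gradient direction update, a nonnegative ("plus") restriction on the parameter beta_k, and an Armijo-type backtracking line search. Below is a minimal runnable sketch of a generic PRP+ iteration with a descent-direction restart safeguard; it is illustrative only, not the hybrid method of the paper above, and the function names (`prp_plus`, `armijo`) are our own.

```python
def armijo(f, x, d, g, c=1e-4):
    # Backtracking Armijo line search: halve t until sufficient decrease holds.
    t, fx = 1.0, f(x)
    slope = sum(gi * di for gi, di in zip(g, d))     # directional derivative
    for _ in range(60):                              # cap the backtracking
        if f([xi + t * di for xi, di in zip(x, d)]) <= fx + c * t * slope:
            break
        t *= 0.5
    return t

def prp_plus(f, grad, x0, tol=1e-6, max_iter=500):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                            # start with steepest descent
    for _ in range(max_iter):
        gg = sum(gi * gi for gi in g)
        if gg ** 0.5 < tol:
            break
        if sum(gi * di for gi, di in zip(g, d)) >= 0:
            d = [-gi for gi in g]                    # restart: d was not a descent direction
        t = armijo(f, x, d, g)
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # PRP+ parameter: beta_k = max(0, g_new^T (g_new - g) / ||g||^2)
        beta = max(0.0, sum(gn * (gn - gi) for gn, gi in zip(g_new, g)) / gg)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Convex quadratic test problem: minimum at (1, -2)
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]
sol = prp_plus(f, grad, [0.0, 0.0])
```

The nonnegativity clamp on beta is exactly the "plus" restriction discussed in entry 11 below; dropping it recovers the plain PRP update.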
2. High-efficiency improved symmetric successive over-relaxation preconditioned conjugate gradient method for solving large-scale finite element linear equations (cited: 1)
Authors: LI Gen, TANG Chun'an, LI Lianchong. Applied Mathematics and Mechanics (English Edition), SCIE, EI, 2013, Issue 10, pp. 1225-1236.
Abstract: Fast solving of large-scale linear equations in finite element analysis is a classical subject in computational mechanics and a key technique in computer aided engineering (CAE) and computer aided manufacturing (CAM). This paper presents a high-efficiency improved symmetric successive over-relaxation (ISSOR) preconditioned conjugate gradient (PCG) method, which maintains the convergence and inherent parallelism consistent with the original form. Ideally, the computation can be reduced by nearly 50% compared with the original algorithm. It is suitable for high-performance computing with its inherent basic high-efficiency operations. Comparison with numerical results shows that the proposed method has the best performance.
Keywords: improved preconditioned conjugate gradient (PCG) method; conjugate gradient method; large-scale linear equation; finite element method
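The PCG iteration named in the entry above has a fixed overall skeleton regardless of which preconditioner is plugged in. The sketch below substitutes a simple Jacobi (diagonal) preconditioner for the paper's improved SSOR, purely to keep the structure of the iteration visible; the `pcg` function and the small test system are our own illustration.

```python
def pcg(A, b, tol=1e-10, max_iter=200):
    # Preconditioned conjugate gradient for a symmetric positive definite A x = b.
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    x = [0.0] * n
    r = list(b)                                      # residual b - A x with x = 0
    M_inv = [1.0 / A[i][i] for i in range(n)]        # Jacobi preconditioner M = diag(A)
    z = [mi * ri for mi, ri in zip(M_inv, r)]        # preconditioned residual
    p = list(z)
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if sum(ri * ri for ri in r) < tol * tol:     # stop on small residual norm
            break
        z = [mi * ri for mi, ri in zip(M_inv, r)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

# Small SPD test system
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x = pcg(A, b)
```

Swapping `M_inv` for an SSOR-based application of M^{-1} changes only the two lines that compute `z`; the rest of the loop, and its parallelism profile, is untouched, which is the point the abstract makes about preserving the original form.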
3. A New Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems (cited: 1)
Authors: LIU Jin-kui, WANG Kai-rong, SONG Xiao-qian, DU Xiang-lin. Chinese Quarterly Journal of Mathematics, CSCD, 2010, Issue 3, pp. 444-450.
Abstract: In this paper, an efficient conjugate gradient method is given for solving general unconstrained optimization problems, which guarantees the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable in comparison with the PRP+ method, so it can be widely used in scientific computation.
Keywords: unconstrained optimization; conjugate gradient method; strong Wolfe line search; sufficient descent property; global convergence
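The strong Wolfe line search invoked by this and many later entries consists of two inequalities: a sufficient decrease (Armijo) condition and a strong curvature condition. A small checker is sketched below; the constants `c1` and `c2` are the usual textbook defaults, not values taken from any of these papers, and the function name is our own.

```python
def strong_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.1):
    """Return True if step size t along direction d satisfies both
    strong Wolfe conditions at the point x."""
    phi0 = f(x)
    dphi0 = sum(gi * di for gi, di in zip(grad(x), d))   # slope at t = 0
    x_t = [xi + t * di for xi, di in zip(x, d)]
    # (1) sufficient decrease:  f(x + t d) <= f(x) + c1 * t * grad(x)^T d
    armijo = f(x_t) <= phi0 + c1 * t * dphi0
    # (2) strong curvature:  |grad(x + t d)^T d| <= c2 * |grad(x)^T d|
    curvature = abs(sum(gi * di for gi, di in zip(grad(x_t), d))) <= c2 * abs(dphi0)
    return armijo and curvature

# 1-D example: f(x) = x^2, stepping from x = 1 along d = -1
f = lambda x: x[0] ** 2
grad = lambda x: [2.0 * x[0]]
ok = strong_wolfe(f, grad, [1.0], [-1.0], 1.0)
```

The weak Wolfe variant mentioned in entry 13 below replaces condition (2) with the one-sided inequality grad(x + t d)^T d >= c2 * grad(x)^T d.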
4. Global Convergence of a New Restarting Conjugate Gradient Method for Nonlinear Optimizations (cited: 1)
Author: SUN Qing-ying (Department of Applied Mathematics, Dalian University of Technology, Dalian 116024, China; Department of Applied Mathematics, University of Petroleum, Dongying 257061, China). Chinese Quarterly Journal of Mathematics, CSCD, 2003, Issue 2, pp. 154-162.
Abstract: Conjugate gradient optimization algorithms depend on search directions built with different choices of parameters. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method with two kinds of common line searches is proved. First, it is shown that, using a reverse modulus of continuity function and a forcing function, the new method for solving unconstrained optimization works for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, by a comparison technique, some general convergence properties of the new method with another kind of step size rule are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
Keywords: nonlinear programming; restarting conjugate gradient method; forcing function; reverse modulus of continuity function; convergence
5. Global Convergence of a New Restarting Three Terms Conjugate Gradient Method for Non-linear Optimizations (cited: 1)
Authors: SUN Qing-ying, SANG Zhao-yang, TIAN Feng-ting. Chinese Quarterly Journal of Mathematics, CSCD, 2011, Issue 1, pp. 69-76.
Abstract: In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the FR method, a class of new restarting three-term conjugate gradient methods is presented. Global convergence properties of the new method with two kinds of common line searches are proved.
Keywords: nonlinear programming; restarting three-term conjugate gradient method; forcing function; reverse modulus of continuity function; convergence
6. Subspace Minimization Conjugate Gradient Method Based on Cubic Regularization Model for Unconstrained Optimization (cited: 1)
Authors: Ting Zhao, Hongwei Liu. Journal of Harbin Institute of Technology (New Series), CAS, 2021, Issue 5, pp. 61-69.
Abstract: Many methods have been put forward to solve unconstrained optimization problems, among which the conjugate gradient (CG) method is very important. With the increasing emergence of large-scale problems, subspace techniques have become particularly important and widely used in the field of optimization. In this study, a new CG method is put forward which combines a subspace technique with a cubic regularization model. Besides, a special scaled norm in the cubic regularization model is analyzed. Under certain conditions, some significant characteristics of the search direction are given and the convergence of the algorithm is established. Numerical comparisons on the 145 test functions of the CUTEr library show that the proposed method outperforms two classical CG methods and two new subspace conjugate gradient methods.
Keywords: cubic regularization model; conjugate gradient method; subspace technique; unconstrained optimization
7. GLOBAL CONVERGENCE OF THE GENERAL THREE TERM CONJUGATE GRADIENT METHODS WITH THE RELAXED STRONG WOLFE LINE SEARCH
Authors: XU Zeshui, YUE Zhenjun (Institute of Sciences, PLA University of Science and Technology, Nanjing 210016). Applied Mathematics (A Journal of Chinese Universities), SCIE, CSCD, 2001, Issue 1, pp. 58-62.
Abstract: The global convergence of the general three-term conjugate gradient methods with the relaxed strong Wolfe line search is proved.
Keywords: conjugate gradient method; inexact line search; global convergence
8. IMPROVED PRECONDITIONED CONJUGATE GRADIENT METHOD AND ITS APPLICATION IN F.E.A. FOR ENGINEERING
Authors: ZHENG Hong, GE Xiurun. Applied Mathematics and Mechanics (English Edition), SCIE, EI, 1993, Issue 4, pp. 371-380.
Abstract: In this paper, two theorems of theoretical and practical significance are given with respect to the preconditioned conjugate gradient method (PCCG). The theorems discuss, respectively, the qualitative property of the iterative solution and the construction principle of the iterative matrix. The authors put forward a new incomplete LU factorization technique for non-M-matrices and a method of constructing the iterative matrix. This improved PCCG is used to solve ill-conditioned problems and large-scale three-dimensional finite element problems, and is contrasted with other methods. The abnormal phenomenon that arises when PCCG is used to solve systems of ill-conditioned equations is analyzed. It is shown that the method proposed in this paper is quite effective for solving large-scale finite element equations and ill-conditioned systems of equations.
Keywords: preconditioned conjugate gradient method; finite element; ill-conditioned problems
9. The Conjugate Gradient Method in Random Variables
Authors: HSU Ming-hsiu, LAI King-fai. Chinese Quarterly Journal of Mathematics, 2021, Issue 2, pp. 111-121.
Abstract: We study the conjugate gradient method for solving a system of linear equations whose coefficients are measurable functions, and establish the rate of convergence of this method.
Keywords: Riesz algebra; matrices; measurable functions; ordered structures; conjugate gradient methods; computational methods in function algebras
10. A CLASS OF NONMONOTONE CONJUGATE GRADIENT METHODS FOR NONCONVEX FUNCTIONS
Authors: LIU Yun, WEI Zengxin. Applied Mathematics (A Journal of Chinese Universities), SCIE, CSCD, 2002, Issue 2, pp. 208-214.
Abstract: This paper discusses the global convergence of a class of nonmonotone conjugate gradient methods (NM methods) for nonconvex objective functions. This class of methods includes the nonmonotone counterparts of the modified Polak-Ribière method and the modified Hestenes-Stiefel method as special cases.
Keywords: nonmonotone conjugate gradient method; nonmonotone line search; global convergence; unconstrained optimization
11. CONVERGENCE ANALYSIS ON A CLASS OF CONJUGATE GRADIENT METHODS WITHOUT SUFFICIENT DECREASE CONDITION (cited: 1)
Authors: LIU Guanghui, HAN Jiye, QI Houduo, XU Zhongling. Acta Mathematica Scientia, SCIE, CSCD, 1998, Issue 1, pp. 11-16.
Abstract: Recently, Gilbert and Nocedal [3] investigated the global convergence of conjugate gradient methods related to the Polak-Ribière formula, restricting beta_k to nonnegative values. [5] discussed the same problem as [3] and relaxed beta_k to be negative when the objective function is convex. This paper allows beta_k to be selected in a wider range than [5]. In particular, the global convergence of the corresponding algorithm without the sufficient decrease condition is proved.
Keywords: Polak-Ribière conjugate gradient method; strong Wolfe line search; global convergence
12. A New Two-Parameter Family of Nonlinear Conjugate Gradient Method Without Line Search for Unconstrained Optimization Problem
Author: ZHU Tiefeng. Wuhan University Journal of Natural Sciences, CAS, CSCD, 2024, Issue 5, pp. 403-411.
Abstract: This paper puts forward a two-parameter family of nonlinear conjugate gradient (CG) methods without line search for solving unconstrained optimization problems. The main feature of these methods is that they do not rely on any line search and only require a simple step size formula, while always generating a sufficient descent direction. Under certain assumptions, the proposed method is proved to possess global convergence. Finally, our method is compared with other potential methods. A large number of numerical experiments show that our method is more competitive and effective.
Keywords: unconstrained optimization; conjugate gradient method without line search; global convergence
13. An Adaptive Spectral Conjugate Gradient Method with Restart Strategy
Authors: Zhou Jincheng, Jiang Meixuan, Zhong Zining, Wu Yanqiang, Shao Hu. 数学理论与应用 (Mathematical Theory and Applications), 2024, Issue 3, pp. 106-118.
Abstract: As a generalization of the two-term conjugate gradient method (CGM), the spectral CGM is one of the effective methods for solving unconstrained optimization. In this paper, we enhance the JJSL conjugate parameter, initially proposed by Jiang et al. (Computational and Applied Mathematics, 2021, 40:174), through a convex combination technique. This improvement allows for an adaptive search direction by integrating a newly constructed spectral gradient-type restart strategy. We then develop a new spectral CGM by employing an inexact line search to determine the step size. With the weak Wolfe line search, we establish the sufficient descent property of the proposed search direction. Moreover, under general assumptions, including the use of the strong Wolfe line search for step size calculation, we demonstrate the global convergence of the new algorithm. Finally, the given unconstrained optimization test results show that the new algorithm is effective.
Keywords: unconstrained optimization; spectral conjugate gradient method; restart strategy; inexact line search; global convergence
14. Full waveform inversion with spectral conjugate gradient method
Authors: LIU Xiao, LIU Mingchen, SUN Hui, WANG Qianlong. Global Geology, 2017, Issue 1, pp. 40-45.
Abstract: The spectral conjugate gradient method is an algorithm obtained by combining the spectral gradient method and the conjugate gradient method; it is characterized by the global convergence and simplicity of the spectral gradient method and the small storage of the conjugate gradient method. Moreover, the search direction at each iteration of the spectral conjugate gradient method has been proved to be a descent direction of the objective function even without relying on any line search. The spectral conjugate gradient method is applied to full waveform inversion in numerical tests on the Marmousi model. The authors compare the numerical results obtained by the steepest descent method, the conjugate gradient method, and the spectral conjugate gradient method, which shows that the spectral conjugate gradient method is superior to the other two.
Keywords: full waveform inversion; spectral conjugate gradient method; conjugate gradient method; steepest descent method
15. A Barzilai-Borwein conjugate gradient method (cited: 7)
Authors: DAI YuHong, KOU CaiXia. Science China Mathematics, SCIE, CSCD, 2016, Issue 8, pp. 1511-1524.
Abstract: The linear conjugate gradient method is optimal for convex quadratic minimization due to the Krylov subspace minimization property. The advent of the limited-memory BFGS method and the Barzilai-Borwein gradient method, however, heavily restricted the use of conjugate gradient methods for large-scale nonlinear optimization. This is, to a great extent, due to the requirement of a relatively exact line search at each iteration and the loss of the conjugacy property of the search directions on various occasions. By contrast, the limited-memory BFGS method and the Barzilai-Borwein gradient method share the so-called asymptotic one-stepsize-per-line-search property: the trial stepsize is asymptotically accepted by the line search when the iteration is close to the solution. This paper focuses on the analysis of the subspace minimization conjugate gradient method of Yuan and Stoer (1995). Specifically, by choosing the parameter in the method with the Barzilai-Borwein idea, we provide some efficient Barzilai-Borwein conjugate gradient (BBCG) methods. Initial numerical experiments show that one of the variants, BBCG3, is especially efficient among many others without line searches. This variant of the BBCG method might enjoy the asymptotic one-stepsize-per-line-search property and become a strong candidate for large-scale nonlinear optimization.
Keywords: conjugate gradient method; subspace minimization; Barzilai-Borwein gradient method; line search; descent property; global convergence
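The Barzilai-Borwein step size mentioned in the abstract above replaces the line search with a two-point scaling computed from the last step s_k = x_k - x_{k-1} and gradient change y_k = g_k - g_{k-1}. Below is a sketch of the plain BB gradient method using the second BB formula, alpha = s^T y / y^T y; it is illustrative only, not one of the paper's BBCG variants, and the function name `bb2` is our own.

```python
def bb2(grad, x0, alpha0=0.01, tol=1e-8, max_iter=1000):
    # Barzilai-Borwein gradient method (BB2 step size), no line search.
    x = list(x0)
    g = grad(x)
    alpha = alpha0                                   # initial trial step
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [xn - xi for xn, xi in zip(x_new, x)]    # last step
        y = [gn - gi for gn, gi in zip(g_new, g)]    # gradient change
        sy = sum(si * yi for si, yi in zip(s, y))
        yy = sum(yi * yi for yi in y)
        if sy > 0 and yy > 0:
            alpha = sy / yy                          # BB2: s^T y / y^T y
        x, g = x_new, g_new
    return x

# Strictly convex quadratic with minimum at (1, -2)
grad = lambda x: [2.0 * (x[0] - 1.0), 10.0 * (x[1] + 2.0)]
xstar = bb2(grad, [0.0, 0.0])
```

Note the iteration is nonmonotone: the objective value may rise on individual steps, which is exactly why BB-type methods pair naturally with the nonmonotone line searches of entry 10.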
16. TESTING DIFFERENT CONJUGATE GRADIENT METHODS FOR LARGE-SCALE UNCONSTRAINED OPTIMIZATION (cited: 10)
Authors: Yu-hong Dai, Qin Ni. Journal of Computational Mathematics, SCIE, CSCD, 2003, Issue 3, pp. 311-320.
Abstract: In this paper we test different conjugate gradient (CG) methods for solving large-scale unconstrained optimization problems. The methods are divided into two groups: the first group includes five basic CG methods and the second five hybrid CG methods. A collection of medium-scale and large-scale test problems is drawn from a standard code of test problems, CUTE. The conjugate gradient methods are ranked according to the numerical results, and some remarks are given.
Keywords: conjugate gradient methods; large-scale; unconstrained optimization; numerical tests
17. A Modified Hestenes-Stiefel Conjugate Gradient Method and Its Convergence (cited: 9)
Authors: Zeng Xin WEI, Hai Dong HUANG, Yan Rong TAO. Journal of Mathematical Research and Exposition, CSCD, 2010, Issue 2, pp. 297-308.
Abstract: It is well known that the direction generated by the Hestenes-Stiefel (HS) conjugate gradient method may not be a descent direction for the objective function. In this paper, we make a small modification to the HS method so that the generated direction always satisfies the sufficient descent condition. An advantage of the modified Hestenes-Stiefel (MHS) method is that the scalar beta_k^HS remains nonnegative under the weak Wolfe-Powell line search. The global convergence result of the MHS method is established under some mild conditions. Preliminary numerical results show that the MHS method is slightly more efficient than the PRP and HS methods.
Keywords: conjugate gradient method; sufficient descent condition; line search; global convergence
18. THE RESTRICTIVELY PRECONDITIONED CONJUGATE GRADIENT METHODS ON NORMAL RESIDUAL FOR BLOCK TWO-BY-TWO LINEAR SYSTEMS (cited: 4)
Authors: Junfeng Yin, Zhongzhi Bai. Journal of Computational Mathematics, SCIE, EI, CSCD, 2008, Issue 2, pp. 240-249.
Abstract: The restrictively preconditioned conjugate gradient (RPCG) method is further developed to solve large sparse systems of linear equations with a block two-by-two structure. The basic idea of this new approach is to apply the RPCG method to the normal-residual equation of the block two-by-two linear system and to construct each required approximate matrix using the incomplete orthogonal factorization of the involved matrix blocks. Numerical experiments show that the new method, called the restrictively preconditioned conjugate gradient on normal residual (RPCGNR) method, is more robust and effective than either the known RPCG method or the standard conjugate gradient on normal residual (CGNR) method for solving large sparse saddle point problems.
Keywords: block two-by-two linear system; saddle point problem; restrictively preconditioned conjugate gradient method; normal-residual equation; incomplete orthogonal factorization
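The CGNR idea underlying the entry above is to run conjugate gradients on the normal equations A^T A x = A^T b while only ever multiplying by A and A^T, never forming A^T A explicitly. A plain, unpreconditioned sketch follows (the paper's restrictively preconditioned variant layers a preconditioner onto this skeleton; the `cgnr` function and test data are our own illustration):

```python
def cgnr(A, b, tol=1e-12, max_iter=200):
    # CG on the normal equations A^T A x = A^T b, for a general m x n matrix A.
    m, n = len(A), len(A[0])
    Av = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
    ATv = lambda v: [sum(A[i][j] * v[i] for i in range(m)) for j in range(n)]
    x = [0.0] * n
    r = list(b)                                      # residual b - A x with x = 0
    z = ATv(r)                                       # normal residual A^T r
    p = list(z)
    zz = sum(zi * zi for zi in z)
    for _ in range(max_iter):
        Ap = Av(p)
        alpha = zz / sum(ai * ai for ai in Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        z = ATv(r)
        zz_new = sum(zi * zi for zi in z)
        if zz_new < tol:                             # stop on small normal residual
            break
        p = [zi + (zz_new / zz) * pi for zi, pi in zip(z, p)]
        zz = zz_new
    return x

# Overdetermined least-squares example: fit y = x0 + x1 * t through
# (1,1), (2,2), (3,3); the exact solution is x0 = 0, x1 = 1.
A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
b = [1.0, 2.0, 3.0]
x = cgnr(A, b)
```

Because A^T A squares the condition number of A, unpreconditioned CGNR can converge slowly on ill-conditioned problems; that squaring is what motivates preconditioned variants such as the paper's RPCGNR.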
19. Global Convergence of a Modified PRP Conjugate Gradient Method (cited: 10)
Authors: Hai Dong HUANG, Yan Jun LI, Zeng Xin WEI. Journal of Mathematical Research and Exposition, CSCD, 2010, Issue 1, pp. 141-148.
Abstract: In this paper, a modified formula for beta_k^PRP is proposed for the conjugate gradient method for solving unconstrained optimization problems. The value of beta_k^PRP remains nonnegative independently of the line search. Under mild conditions, the global convergence of the modified PRP method with the strong Wolfe-Powell line search is established. Preliminary numerical results show that the modified method is efficient.
Keywords: unconstrained optimization; conjugate gradient method; global convergence
20. Global Convergence of a Modified Spectral CD Conjugate Gradient Method (cited: 7)
Authors: Wei CAO, Kai Rong WANG, Yi Li WANG. Journal of Mathematical Research and Exposition, CSCD, 2011, Issue 2, pp. 261-268.
Abstract: In this paper, we present a new nonlinear modified spectral CD conjugate gradient method for solving large-scale unconstrained optimization problems. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method if the line search is exact. Under some mild conditions, we prove that the modified method with line search is globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.
Keywords: unconstrained optimization; conjugate gradient method; Armijo-type line search; global convergence