This paper puts forward a two-parameter family of nonlinear conjugate gradient (CG) methods without line search for solving unconstrained optimization problems. The main feature of this family is that it does not rely on any line search and requires only a simple step size formula, while always generating a sufficient descent direction. Under certain assumptions, the proposed method is proved to possess global convergence. Finally, our method is compared with other potential methods; a large number of numerical experiments show that our method is competitive and effective.
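For orientation — the abstract does not reproduce the family's two parameters or its step size formula — a generic nonlinear CG iteration and the sufficient descent condition it refers to can be written as

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_k = -g_k + \beta_k d_{k-1} \ (k \ge 1),
\]
\[
g_k^\top d_k \le -c\,\|g_k\|^2 \quad \text{for some } c > 0 \text{ and all } k,
\]

where g_k = ∇f(x_k), β_k is the conjugate parameter (here carrying the two free parameters), and α_k comes from a closed-form formula instead of a line search.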
As a generalization of the two-term conjugate gradient method (CGM), the spectral CGM is one of the effective methods for solving unconstrained optimization problems. In this paper, we enhance the JJSL conjugate parameter, initially proposed by Jiang et al. (Computational and Applied Mathematics, 2021, 40: 174), through a convex combination technique. This improvement allows for an adaptive search direction by integrating a newly constructed spectral gradient-type restart strategy. We then develop a new spectral CGM by employing an inexact line search to determine the step size. With the weak Wolfe line search, we establish the sufficient descent property of the proposed search direction. Moreover, under general assumptions, including the use of the strong Wolfe line search for step size calculation, we demonstrate the global convergence of the new algorithm. Finally, results on unconstrained optimization test problems show that the new algorithm is effective.
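As a reference for the terms used above (the JJSL parameter itself is defined in the cited paper), a spectral CG direction and the Wolfe-type searches take the standard forms

\[
d_k = -\theta_k g_k + \beta_k d_{k-1},
\]
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \rho\,\alpha_k\, g_k^\top d_k, \qquad g(x_k + \alpha_k d_k)^\top d_k \ge \sigma\, g_k^\top d_k,
\]

with 0 < ρ < σ < 1 (weak Wolfe); the strong Wolfe search replaces the second inequality by \(|g(x_k + \alpha_k d_k)^\top d_k| \le -\sigma\, g_k^\top d_k\).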
A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak and Ribière, and by Polyak, is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independent of the line search and of the convexity of the objective function. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
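For reference, the PRP parameter and the three-term direction of Zhang et al. (in its commonly stated form; how the paper hybridizes the two is specified there) are

\[
\beta_k^{PRP} = \frac{g_k^\top y_{k-1}}{\|g_{k-1}\|^2}, \qquad
d_k = -g_k + \beta_k^{PRP} d_{k-1} - \frac{g_k^\top d_{k-1}}{\|g_{k-1}\|^2}\, y_{k-1}, \qquad y_{k-1} = g_k - g_{k-1},
\]

a direction that satisfies \(g_k^\top d_k = -\|g_k\|^2\) regardless of the line search.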
Fast solution of large-scale linear equations in finite element analysis is a classical subject in computational mechanics and a key technique in computer aided engineering (CAE) and computer aided manufacturing (CAM). This paper presents a high-efficiency improved symmetric successive over-relaxation (ISSOR) preconditioned conjugate gradient (PCG) method, which maintains the convergence and inherent parallelism of the original form. Ideally, the computation can be reduced by nearly 50% compared with the original algorithm, and the method is suitable for high-performance computing with its inherent basic high-efficiency operations. Comparison with numerical results shows that the proposed method has the best performance.
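The ISSOR variant itself is not given in the abstract; as a minimal sketch of the surrounding machinery, a preconditioned CG loop with a pluggable preconditioner solve (an (I)SSOR preconditioner would supply `M_solve`) might look like this:

```python
import numpy as np

def pcg(A, b, M_solve, x0=None, tol=1e-8, max_iter=1000):
    """Preconditioned conjugate gradient for symmetric positive definite A.
    M_solve(r) returns M^{-1} r for an SPD preconditioner M
    (an (I)SSOR preconditioner in the paper's setting)."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                 # residual
    z = M_solve(r)                # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # exact minimizer along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # conjugate direction update
        rz = rz_new
    return x
```

A crude stand-in such as `M_solve = lambda r: r / np.diag(A)` (Jacobi) already exercises the interface.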
In this paper, an efficient conjugate gradient method is given for solving general unconstrained optimization problems, which guarantees the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable by comparison with the PRP+ method, so it can be widely used in scientific computation.
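The PRP+ baseline referred to here truncates the Polak-Ribière-Polyak parameter at zero:

\[
\beta_k^{PRP} = \frac{g_k^\top (g_k - g_{k-1})}{\|g_{k-1}\|^2}, \qquad \beta_k^{PRP+} = \max\{\beta_k^{PRP},\, 0\}.
\]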
Conjugate gradient optimization algorithms depend on their search directions, which vary with the choice of parameters in the directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method with two kinds of common line searches is proved. First, it is shown that, using a reverse modulus of continuity function and a forcing function, the new method for solving unconstrained optimization works for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, by a comparison technique, some general convergence properties of the new method with another kind of step size rule are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
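The Hu-Storey class confines the conjugate parameter to a range tied to the FR value; a commonly cited hybrid of this type (shown for orientation only, not as the paper's exact restarting rule) is

\[
\beta_k = \max\{0,\, \min\{\beta_k^{PRP},\, \beta_k^{FR}\}\}, \qquad \beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}.
\]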
In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the FR method, a class of new restarting three-term conjugate gradient methods is presented. Global convergence properties of the new method with two kinds of common line searches are proved.
Many methods have been put forward to solve unconstrained optimization problems, among which the conjugate gradient (CG) method is very important. With the increasing emergence of large-scale problems, subspace technology has become particularly important and widely used in the field of optimization. In this study, a new CG method is put forward which combines subspace technology with a cubic regularization model. In addition, a special scaled norm in the cubic regularization model is analyzed. Under certain conditions, some significant characteristics of the search direction are given and the convergence of the algorithm is established. Numerical comparisons on 145 test functions from the CUTEr library show that the proposed method outperforms two classical CG methods and two recent subspace conjugate gradient methods.
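For reference, a cubic regularization model with a scaled norm (the paper analyzes one special choice of the scaling) has the generic form

\[
m_k(d) = f(x_k) + g_k^\top d + \tfrac{1}{2}\, d^\top B_k d + \tfrac{\sigma_k}{3}\, \|d\|_{M_k}^3, \qquad \|d\|_{M_k} = (d^\top M_k d)^{1/2},
\]

where B_k approximates the Hessian and σ_k > 0 is the regularization weight; the subspace idea is to minimize m_k only over a low-dimensional subspace such as span{−g_k, d_{k−1}}.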
In this paper, two theorems with theoretical and practical significance are given for the preconditioned conjugate gradient method (PCCG). The theorems discuss, respectively, the qualitative property of the iterative solution and the construction principle of the iterative matrix. The authors put forward a new incomplete LU factorization technique for non-M-matrices and a method of constructing the iterative matrix. The improved PCCG is used to solve ill-conditioned problems and large-scale three-dimensional finite element problems, and is contrasted with other methods. The abnormal phenomenon that arises when PCCG is used to solve systems of ill-conditioned equations is analyzed. It is shown that the method proposed in this paper is quite effective in solving large-scale finite element equations and ill-conditioned systems.
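The paper's incomplete LU technique for non-M-matrices is not spelled out in the abstract; purely as an illustration of the general ILU-preconditioned CG pattern, SciPy's `spilu` can supply the preconditioner action (the matrix here is a hypothetical 1-D Poisson stand-in for a finite element system):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Hypothetical SPD test matrix (1-D Poisson); the paper targets FEM systems.
n = 1000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)  # incomplete LU factors of A
M = spla.LinearOperator((n, n), matvec=ilu.solve)   # action of M^{-1} on a vector

x, info = spla.cg(A, b, M=M)                        # info == 0 on convergence
print(info, np.linalg.norm(A @ x - b))
```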
We study the conjugate gradient method for solving a system of linear equations with coefficients which are measurable functions and establish the rate of convergence of this method.
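For the classical finite-dimensional case, the rate in question is usually expressed through the energy-norm bound

\[
\|x_k - x_\ast\|_A \le 2 \left( \frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1} \right)^{k} \|x_0 - x_\ast\|_A, \qquad \kappa = \frac{\lambda_{\max}(A)}{\lambda_{\min}(A)},
\]

which is the kind of estimate being extended to the measurable-coefficient setting.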
This paper discusses the global convergence of a class of nonmonotone conjugate gradient methods (NM methods) for nonconvex objective functions. This class of methods includes the nonmonotone counterparts of the modified Polak-Ribière method and the modified Hestenes-Stiefel method as special cases.
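Nonmonotone methods of this kind typically replace the monotone Armijo test by a watchdog over recent function values, in the style of Grippo, Lampariello, and Lucidi:

\[
f(x_k + \alpha_k d_k) \le \max_{0 \le j \le m(k)} f(x_{k-j}) + \delta\, \alpha_k\, g_k^\top d_k, \qquad \delta \in (0,1),
\]

where m(k) bounds the memory of past iterates.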
Recently, Gilbert and Nocedal [3] investigated the global convergence of conjugate gradient methods related to the Polak-Ribière formula; they restricted β_k to nonnegative values. [5] discussed the same problem as [3] and allowed β_k to be negative provided the objective function is convex. This paper allows β_k to be selected in a wider range than in [5]. In particular, the global convergence of the corresponding algorithm without the sufficient descent condition is proved.
The spectral conjugate gradient method is an algorithm obtained by combining the spectral gradient method and the conjugate gradient method; it inherits the global convergence and simplicity of the spectral gradient method and the small storage requirement of the conjugate gradient method. Moreover, the search direction of the spectral conjugate gradient method at each iteration has been proved to be a descent direction of the objective function, even without relying on any line search. The method is applied to full waveform inversion in numerical tests on the Marmousi model. The authors compare numerical results obtained by the steepest descent method, the conjugate gradient method, and the spectral conjugate gradient method; the comparison shows that the spectral conjugate gradient method is superior to the other two.
The linear conjugate gradient method is an optimal method for convex quadratic minimization due to the Krylov subspace minimization property. The advent of the limited-memory BFGS method and the Barzilai-Borwein gradient method, however, heavily restricted the use of the conjugate gradient method for large-scale nonlinear optimization. This is, to a great extent, due to the requirement of a relatively exact line search at each iteration and the loss of the conjugacy property of the search directions on various occasions. In contrast, the limited-memory BFGS method and the Barzilai-Borwein gradient method share the so-called asymptotic one-stepsize-per-line-search property, namely, the trial stepsize in the method is asymptotically accepted by the line search when the iterate is close to the solution. This paper focuses on the analysis of the subspace minimization conjugate gradient method of Yuan and Stoer (1995). Specifically, by choosing the parameter in the method in combination with the Barzilai-Borwein idea, we are able to provide efficient Barzilai-Borwein conjugate gradient (BBCG) methods. Initial numerical experiments show that one of the variants, BBCG3, is especially efficient among many others without line searches. This variant of the BBCG method might enjoy the asymptotic one-stepsize-per-line-search property and become a strong candidate for large-scale nonlinear optimization.
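The Barzilai-Borwein idea referenced here draws the stepsize from a secant condition: with s_{k−1} = x_k − x_{k−1} and y_{k−1} = g_k − g_{k−1}, the two classical BB stepsizes are

\[
\alpha_k^{BB1} = \frac{s_{k-1}^\top s_{k-1}}{s_{k-1}^\top y_{k-1}}, \qquad
\alpha_k^{BB2} = \frac{s_{k-1}^\top y_{k-1}}{y_{k-1}^\top y_{k-1}};
\]

which combination defines the BBCG3 variant is specified in the paper itself.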
In this paper we test different conjugate gradient (CG) methods for solving large-scale unconstrained optimization problems. The methods are divided into two groups: the first group includes five basic CG methods and the second five hybrid CG methods. A collection of medium-scale and large-scale test problems is drawn from a standard code of test problems, CUTE. The conjugate gradient methods are ranked according to the numerical results, and some remarks are given.
It is well known that the direction generated by the Hestenes-Stiefel (HS) conjugate gradient method may not be a descent direction for the objective function. In this paper, we make a small modification to the HS method so that the generated direction always satisfies the sufficient descent condition. An advantage of the modified Hestenes-Stiefel (MHS) method is that the scalar β_k^HS remains nonnegative under the weak Wolfe-Powell line search. The global convergence of the MHS method is established under some mild conditions. Preliminary numerical results show that the MHS method is slightly more efficient than the PRP and HS methods.
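For reference, with y_{k−1} = g_k − g_{k−1}, the HS parameter reads

\[
\beta_k^{HS} = \frac{g_k^\top y_{k-1}}{d_{k-1}^\top y_{k-1}},
\]

and the sufficient descent condition demanded of the modified direction is \(g_k^\top d_k \le -c\|g_k\|^2\) for some c > 0; the precise MHS modification is defined in the paper.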
The restrictively preconditioned conjugate gradient (RPCG) method is further developed to solve large sparse systems of linear equations with a block two-by-two structure. The basic idea of this new approach is to apply the RPCG method to the normal-residual equation of the block two-by-two linear system and to construct each required approximate matrix by making use of the incomplete orthogonal factorization of the involved matrix blocks. Numerical experiments show that the new method, called the restrictively preconditioned conjugate gradient on normal residual (RPCGNR) method, is more robust and effective than either the known RPCG method or the standard conjugate gradient on normal residual (CGNR) method when used for solving large sparse saddle point problems.
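CGNR, the baseline named above, applies CG to the normal-residual equations AᵀAx = Aᵀb; a minimal unpreconditioned sketch (RPCGNR adds the restrictive preconditioner on top of this scheme) is:

```python
import numpy as np

def cgnr(A, b, tol=1e-10, max_iter=None):
    """CG on the normal equations A^T A x = A^T b (CGNR).
    Works for any full-rank A; only products with A and A^T are needed."""
    m, n = A.shape
    max_iter = max_iter or 2 * n
    x = np.zeros(n)
    r = b.copy()            # residual of A x = b
    z = A.T @ r             # residual of the normal equations
    p = z.copy()
    zz = z @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = zz / (Ap @ Ap)
        x += alpha * p
        r -= alpha * Ap
        z = A.T @ r
        zz_new = z @ z
        if np.sqrt(zz_new) <= tol:
            break
        p = z + (zz_new / zz) * p
        zz = zz_new
    return x
```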
In this paper, a modified formula for β_k^PRP is proposed for the conjugate gradient method for solving unconstrained optimization problems. The value of β_k^PRP remains nonnegative independent of the line search. Under mild conditions, the global convergence of the modified PRP method with the strong Wolfe-Powell line search is established. Preliminary numerical results show that the modified method is efficient.
In this paper, we present a new nonlinear modified spectral CD conjugate gradient method for solving large-scale unconstrained optimization problems. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method if the line search is exact. Under some mild conditions, we prove that the modified method with line search is globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.
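The standard CD (conjugate descent) parameter of Fletcher, to which the method reduces under exact line search, is

\[
\beta_k^{CD} = \frac{\|g_k\|^2}{-d_{k-1}^\top g_{k-1}},
\]

and a spectral modification additionally scales the gradient term in the direction update, d_k = −θ_k g_k + β_k d_{k−1}; the exact θ_k and modified β_k are given in the paper.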