Journal Articles
A total of 78 articles were found.
1. GLOBAL CONVERGENCE OF THE GENERAL THREE TERM CONJUGATE GRADIENT METHODS WITH THE RELAXED STRONG WOLFE LINE SEARCH
Authors: Xu Zeshui, Yue Zhenjun (Institute of Sciences, PLA University of Science and Technology, Nanjing 210016). Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2001, No. 1, pp. 58-62 (5 pages).
The global convergence of the general three-term conjugate gradient methods with the relaxed strong Wolfe line search is proved.
Keywords: conjugate gradient method; inexact line search; global convergence.
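For reference, a standard strong Wolfe line search accepts a step size \alpha_k along a descent direction d_k satisfying
\[
f(x_k+\alpha_k d_k) \le f(x_k) + \delta\,\alpha_k \nabla f(x_k)^{\mathrm T} d_k, \qquad
\bigl|\nabla f(x_k+\alpha_k d_k)^{\mathrm T} d_k\bigr| \le \sigma \bigl|\nabla f(x_k)^{\mathrm T} d_k\bigr|,
\]
with constants \(0<\delta<\sigma<1\); the "relaxed" variant analyzed in this paper weakens these conditions in a way specified there.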
2. A CLASS OF NONMONOTONE CONJUGATE GRADIENT METHODS FOR NONCONVEX FUNCTIONS
Authors: Liu Yun, Wei Zengxin. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2002, No. 2, pp. 208-214 (7 pages).
This paper discusses the global convergence of a class of nonmonotone conjugate gradient methods (NM methods) for nonconvex objective functions. This class of methods includes the nonmonotone counterparts of the modified Polak-Ribière method and the modified Hestenes-Stiefel method as special cases.
Keywords: nonmonotone conjugate gradient method; nonmonotone line search; global convergence; unconstrained optimization.
3. CONVERGENCE ANALYSIS ON A CLASS OF CONJUGATE GRADIENT METHODS WITHOUT SUFFICIENT DECREASE CONDITION (Cited: 1)
Authors: 刘光辉, 韩继业, 戚厚铎, 徐中玲. Acta Mathematica Scientia, SCIE CSCD, 1998, No. 1, pp. 11-16 (6 pages).
Recently, Gilbert and Nocedal [3] investigated the global convergence of conjugate gradient methods related to the Polak-Ribière formula, restricting beta(k) to non-negative values. [5] discussed the same problem as [3] and relaxed beta(k) to be negative when the objective function is convex. This paper allows beta(k) to be selected in a wider range than in [5]. In particular, the global convergence of the corresponding algorithm without the sufficient decrease condition is proved.
Keywords: Polak-Ribière conjugate gradient method; strong Wolfe line search; global convergence.
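With g_k = \nabla f(x_k), the Polak-Ribière parameter discussed in this entry is
\[
\beta_k^{PR} = \frac{g_{k+1}^{\mathrm T}\,(g_{k+1}-g_k)}{\lVert g_k\rVert^{2}},
\]
and the non-negativity restriction of Gilbert and Nocedal corresponds to using \beta_k^{+}=\max\{\beta_k^{PR},0\}; the paper studies convergence when \beta_k is allowed to range more widely than this.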
4. An Adaptive Spectral Conjugate Gradient Method with Restart Strategy
Authors: Zhou Jincheng, Jiang Meixuan, Zhong Zining, Wu Yanqiang, Shao Hu. Mathematical Theory and Applications (数学理论与应用), 2024, No. 3, pp. 106-118 (13 pages).
As a generalization of the two-term conjugate gradient method (CGM), the spectral CGM is one of the effective methods for solving unconstrained optimization problems. In this paper, we enhance the JJSL conjugate parameter, initially proposed by Jiang et al. (Computational and Applied Mathematics, 2021, 40: 174), through a convex combination technique. This improvement allows for an adaptive search direction by integrating a newly constructed spectral gradient-type restart strategy. We then develop a new spectral CGM by employing an inexact line search to determine the step size. Under the weak Wolfe line search, we establish the sufficient descent property of the proposed search direction. Moreover, under general assumptions, including the use of the strong Wolfe line search for the step size calculation, we prove the global convergence of the new algorithm. Finally, results on standard unconstrained optimization test problems show that the new algorithm is effective.
Keywords: unconstrained optimization; spectral conjugate gradient method; restart strategy; inexact line search; global convergence.
5. TESTING DIFFERENT CONJUGATE GRADIENT METHODS FOR LARGE-SCALE UNCONSTRAINED OPTIMIZATION (Cited: 10)
Authors: Yu-hong Dai, Qin Ni. Journal of Computational Mathematics, SCIE CSCD, 2003, No. 3, pp. 311-320 (10 pages).
In this paper we test different conjugate gradient (CG) methods for solving large-scale unconstrained optimization problems. The methods are divided into two groups: the first group includes five basic CG methods and the second five hybrid CG methods. A collection of medium-scale and large-scale test problems is drawn from a standard collection of test problems, CUTE. The conjugate gradient methods are ranked according to the numerical results, and some remarks are given.
Keywords: conjugate gradient methods; large-scale; unconstrained optimization; numerical tests.
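The abstract does not list which basic formulas are compared; the conjugate-parameter choices most commonly tested in this literature, with y_k = g_{k+1}-g_k and the recurrence d_{k+1} = -g_{k+1} + \beta_k d_k, are
\[
\beta_k^{FR}=\frac{\lVert g_{k+1}\rVert^{2}}{\lVert g_k\rVert^{2}},\quad
\beta_k^{PRP}=\frac{g_{k+1}^{\mathrm T} y_k}{\lVert g_k\rVert^{2}},\quad
\beta_k^{HS}=\frac{g_{k+1}^{\mathrm T} y_k}{d_k^{\mathrm T} y_k},\quad
\beta_k^{DY}=\frac{\lVert g_{k+1}\rVert^{2}}{d_k^{\mathrm T} y_k},\quad
\beta_k^{CD}=-\frac{\lVert g_{k+1}\rVert^{2}}{d_k^{\mathrm T} g_k}.
\]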
6. PRECONDITIONED CONJUGATE GRADIENT METHODS FOR INTEGRAL EQUATIONS OF THE SECOND KIND DEFINED ON THE HALF-LINE
Authors: Chan, R. H., Lin, F. R. Journal of Computational Mathematics, SCIE CSCD, 1996, No. 3, pp. 223-236 (14 pages).
We consider solving integral equations of the second kind defined on the half-line [0, infinity) by the preconditioned conjugate gradient method. Convergence is known to be slow due to the non-compactness of the associated integral operator. In this paper, we construct two different circulant integral operators to be used as preconditioners for the method to speed up its convergence rate. We prove that if the given integral operator is close to a convolution-type integral operator, then the preconditioned systems will have spectra clustered around 1, and hence the preconditioned conjugate gradient method will converge superlinearly. Numerical examples are given to illustrate the fast convergence.
Keywords: preconditioned conjugate gradient methods; integral equations of the second kind; half-line.
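The following is a minimal sketch, in Python/NumPy, of the preconditioned conjugate gradient iteration for a symmetric positive definite system A x = b. The function name pcg is ours, and a simple Jacobi (diagonal) preconditioner stands in for the circulant preconditioners constructed in the paper.

import numpy as np

def pcg(A, b, M_inv, x0=None, tol=1e-8, max_iter=200):
    """Solve A x = b, where M_inv(r) applies an approximation of M^{-1} to r."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                 # residual
    z = M_inv(r)                  # preconditioned residual
    p = z.copy()                  # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # step length
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        beta = rz_new / rz        # standard PCG update of the direction
        p = z + beta * p
        rz = rz_new
    return x

# Usage on a small SPD test matrix with a Jacobi preconditioner.
n = 100
A = np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1) + np.diag(-np.ones(n - 1), -1)
b = np.ones(n)
d = np.diag(A)
x = pcg(A, b, lambda r: r / d)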
7. THE RESTRICTIVELY PRECONDITIONED CONJUGATE GRADIENT METHODS ON NORMAL RESIDUAL FOR BLOCK TWO-BY-TWO LINEAR SYSTEMS (Cited: 4)
Authors: Junfeng Yin, Zhongzhi Bai. Journal of Computational Mathematics, SCIE EI CSCD, 2008, No. 2, pp. 240-249 (10 pages).
The restrictively preconditioned conjugate gradient (RPCG) method is further developed to solve large sparse systems of linear equations with a block two-by-two structure. The basic idea of this new approach is to apply the RPCG method to the normal-residual equation of the block two-by-two linear system and to construct each required approximate matrix by making use of the incomplete orthogonal factorization of the involved matrix blocks. Numerical experiments show that the new method, called the restrictively preconditioned conjugate gradient on normal residual (RPCGNR) method, is more robust and effective than either the known RPCG method or the standard conjugate gradient on normal residual (CGNR) method when used for solving large sparse saddle point problems.
Keywords: block two-by-two linear system; saddle point problem; restrictively preconditioned conjugate gradient method; normal-residual equation; incomplete orthogonal factorization.
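For a nonsymmetric (or rectangular) system A x = b, the normal-residual equation referred to here is the symmetric positive semidefinite system
\[
A^{\mathrm T} A\,x = A^{\mathrm T} b,
\]
to which CG, or in this paper the restrictively preconditioned CG, is applied.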
8. Conjugate Gradient Methods with Armijo-type Line Searches (Cited: 12)
Author: Yu-Hong Dai (State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics, Academy of Mathematics and System Sciences, Chinese Academy of Sciences, Beijing 100080, China). Acta Mathematicae Applicatae Sinica, SCIE CSCD, 2002, No. 1, pp. 123-130 (8 pages).
Two Armijo-type line searches are proposed in this paper for nonlinear conjugate gradient methods. Under these line searches, global convergence results are established for several famous conjugate gradient methods, including the Fletcher-Reeves method, the Polak-Ribière-Polyak method, and the conjugate descent method.
Keywords: unconstrained optimization; conjugate gradient method; line search; global convergence.
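For orientation, a plain backtracking Armijo line search (not the paper's two specific variants) can be sketched as follows in Python; the function name, the constants delta and rho, and the quadratic test problem are illustrative assumptions.

import numpy as np

def armijo_backtracking(f, grad, x, d, alpha0=1.0, delta=1e-4, rho=0.5, max_backtracks=50):
    """Return alpha with f(x + alpha d) <= f(x) + delta*alpha*grad(x)^T d."""
    fx = f(x)
    slope = grad(x) @ d           # directional derivative; assumed negative (descent direction)
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + delta * alpha * slope:
            return alpha
        alpha *= rho              # shrink the trial step
    return alpha

# Usage on a simple quadratic with the steepest-descent direction.
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([3.0, -4.0])
alpha = armijo_backtracking(f, grad, x, -grad(x))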
9. TWO FUNDAMENTAL CONVERGENCE THEOREMS FOR NONLINEAR CONJUGATE GRADIENT METHODS AND THEIR APPLICATIONS (Cited: 1)
Authors: 韩继业, 刘光辉, 孙德锋, 尹红霞. Acta Mathematicae Applicatae Sinica, SCIE CSCD, 2001, No. 1, pp. 38-46 (9 pages).
Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under only the descent condition. As a result, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for a parameter at its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results hold for the Hestenes-Stiefel algorithm.
Keywords: conjugate gradient method; descent condition; global convergence.
10. The Conjugate Gradient Method in Random Variables
Authors: Hsu Ming-hsiu, Lai King-fai. Chinese Quarterly Journal of Mathematics, 2021, No. 2, pp. 111-121 (11 pages).
We study the conjugate gradient method for solving a system of linear equations with coefficients which are measurable functions and establish the rate of convergence of this method.
Keywords: Riesz algebra; matrices; measurable functions; ordered structures; conjugate gradient methods; computational methods in function algebras.
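For comparison, in the classical setting of a symmetric positive definite matrix A with condition number \kappa, the conjugate gradient iterates satisfy the well-known rate bound
\[
\lVert x_k-x_{\ast}\rVert_{A} \le 2\left(\frac{\sqrt{\kappa}-1}{\sqrt{\kappa}+1}\right)^{\!k}\lVert x_0-x_{\ast}\rVert_{A};
\]
the entry above concerns an analogue of such a rate when the coefficients are measurable functions.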
11. A modified three-term conjugate gradient method with sufficient descent property (Cited: 1)
Author: Saman Babaie-Kafaki. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2015, No. 3, pp. 263-272 (10 pages).
A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak, Ribière, and Polyak is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independent of the line search and of the convexity of the objective function. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
Keywords: unconstrained optimization; conjugate gradient method; eigenvalue; sufficient descent condition; global convergence.
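One widely cited form of the three-term PRP direction of Zhang et al., stated here only for orientation (the hybrid of this paper modifies it), is
\[
d_{k+1} = -g_{k+1} + \beta_k^{PRP} d_k - \theta_k y_k,\qquad
\theta_k = \frac{g_{k+1}^{\mathrm T} d_k}{\lVert g_k\rVert^{2}},\quad y_k = g_{k+1}-g_k,
\]
which gives d_{k+1}^{\mathrm T} g_{k+1} = -\lVert g_{k+1}\rVert^{2} regardless of the line search, i.e. the sufficient descent property holds automatically.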
12. High-efficiency improved symmetric successive over-relaxation preconditioned conjugate gradient method for solving large-scale finite element linear equations (Cited: 1)
Authors: 李根, 唐春安, 李连崇. Applied Mathematics and Mechanics (English Edition), SCIE EI, 2013, No. 10, pp. 1225-1236 (12 pages).
Fast solution of large-scale linear equations in finite element analysis is a classical subject in computational mechanics and a key technique in computer aided engineering (CAE) and computer aided manufacturing (CAM). This paper presents a high-efficiency improved symmetric successive over-relaxation (ISSOR) preconditioned conjugate gradient (PCG) method that maintains convergence and inherent parallelism consistent with the original form. Ideally, the computation can be reduced by nearly 50% compared with the original algorithm, and the method is suitable for high-performance computing with its inherently efficient basic operations. Comparison of numerical results shows that the proposed method has the best performance.
Keywords: improved preconditioned conjugate gradient (PCG) method; conjugate gradient method; large-scale linear equation; finite element method.
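For context, for a symmetric positive definite stiffness matrix split as A = D + L + L^{\mathrm T} (D diagonal, L strictly lower triangular), the classical SSOR preconditioner with relaxation parameter \omega \in (0,2) is
\[
M_{\mathrm{SSOR}} = \frac{1}{\omega(2-\omega)}\,(D+\omega L)\,D^{-1}\,(D+\omega L)^{\mathrm T};
\]
the improved (ISSOR) variant of this paper reorganizes the computation around such a factorized preconditioner, with the details given there.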
13. A New Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems (Cited: 1)
Authors: Liu Jin-kui, Wang Kai-rong, Song Xiao-qian, Du Xiang-lin. Chinese Quarterly Journal of Mathematics, CSCD, 2010, No. 3, pp. 444-450 (7 pages).
In this paper, an efficient conjugate gradient method is given for solving general unconstrained optimization problems; it guarantees the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable in comparison with the PRP+ method, so it can be widely used in scientific computation.
Keywords: unconstrained optimization; conjugate gradient method; strong Wolfe line search; sufficient descent property; global convergence.
14. Global Convergence of a New Restarting Conjugate Gradient Method for Nonlinear Optimizations (Cited: 1)
Author: Sun Qing-ying (Department of Applied Mathematics, Dalian University of Technology, Dalian 116024, China; Department of Applied Mathematics, University of Petroleum, Dongying 257061, China). Chinese Quarterly Journal of Mathematics, CSCD, 2003, No. 2, pp. 154-162 (9 pages).
Conjugate gradient optimization algorithms depend on search directions defined by different choices of the parameters in those directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method with two kinds of common line searches is proved. First, it is shown that, using a reverse modulus of continuity function and a forcing function, the new method for unconstrained optimization works for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, by using a comparison technique, some general convergence properties of the new method with other kinds of step size rules are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
Keywords: nonlinear programming; restarting conjugate gradient method; forcing function; reverse modulus of continuity function; convergence.
15. Global Convergence of a New Restarting Three Terms Conjugate Gradient Method for Non-linear Optimizations (Cited: 1)
Authors: Sun Qing-ying, Sang Zhao-yang, Tian Feng-ting. Chinese Quarterly Journal of Mathematics, CSCD, 2011, No. 1, pp. 69-76 (8 pages).
In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the FR method, a class of new restarting three-term conjugate gradient methods is presented. Global convergence properties of the new method with two kinds of common line searches are proved.
Keywords: nonlinear programming; restarting three-term conjugate gradient method; forcing function; reverse modulus of continuity function; convergence.
16. Subspace Minimization Conjugate Gradient Method Based on Cubic Regularization Model for Unconstrained Optimization (Cited: 1)
Authors: Ting Zhao, Hongwei Liu. Journal of Harbin Institute of Technology (New Series), CAS, 2021, No. 5, pp. 61-69 (9 pages).
Many methods have been put forward to solve unconstrained optimization problems, among which the conjugate gradient (CG) method is very important. With the increasing emergence of large-scale problems, subspace techniques have become particularly important and widely used in the field of optimization. In this study, a new CG method is put forward which combines a subspace technique with a cubic regularization model, and a special scaled norm in the cubic regularization model is analyzed. Under certain conditions, some significant characteristics of the search direction are given and the convergence of the algorithm is established. Numerical comparisons on the 145 test functions from the CUTEr library show that the proposed method is better than two classical CG methods and two recent subspace conjugate gradient methods.
Keywords: cubic regularization model; conjugate gradient method; subspace technique; unconstrained optimization.
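In its standard form, the cubic regularization model referred to here is the local model
\[
m_k(d) = f(x_k) + g_k^{\mathrm T} d + \tfrac{1}{2}\,d^{\mathrm T} B_k d + \tfrac{\sigma_k}{3}\,\lVert d\rVert^{3},
\]
where g_k is the gradient, B_k a Hessian approximation, and \sigma_k > 0 the regularization parameter; this paper minimizes such a model over a low-dimensional subspace, with a special scaled norm in place of \lVert\cdot\rVert.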
17. A New Conjugate Gradient Projection Method for Solving Stochastic Generalized Linear Complementarity Problems (Cited: 2)
Authors: Zhimin Liu, Shouqiang Du, Ruiying Wang. Journal of Applied Mathematics and Physics, 2016, No. 6, pp. 1024-1031 (8 pages).
In this paper, a class of stochastic generalized linear complementarity problems with finitely many elements is proposed for the first time. Based on the Fischer-Burmeister function, a new conjugate gradient projection method is given for solving these problems. The global convergence of the conjugate gradient projection method is proved, and related numerical results are reported.
Keywords: stochastic generalized linear complementarity problems; Fischer-Burmeister function; conjugate gradient projection method; global convergence.
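The Fischer-Burmeister function on which the reformulation rests is
\[
\phi_{FB}(a,b) = \sqrt{a^{2}+b^{2}} - a - b,
\]
which equals zero exactly when a \ge 0, b \ge 0 and ab = 0, so complementarity conditions can be recast as a system of (generally nonsmooth) equations to which gradient-type methods are then applied.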
18. IMPROVED PRECONDITIONED CONJUGATE GRADIENT METHOD AND ITS APPLICATION IN F.E.A. FOR ENGINEERING
Authors: 郑宏, 葛修润. Applied Mathematics and Mechanics (English Edition), SCIE EI, 1993, No. 4, pp. 371-380 (10 pages).
In this paper two theorems of theoretical and practical significance are given for the preconditioned conjugate gradient method (PCCG). The theorems discuss, respectively, the qualitative property of the iterative solution and the construction principle of the iterative matrix. The authors put forward a new incomplete LU factorization technique for non-M-matrices and a method of constructing the iterative matrix. The improved PCCG is used to solve ill-conditioned problems and large-scale three-dimensional finite element problems, and is contrasted with other methods. The abnormal behaviour observed when PCCG is used to solve ill-conditioned systems of equations is analyzed. It is shown that the method proposed in this paper is quite effective for solving large-scale finite element equations and ill-conditioned systems of equations.
Keywords: preconditioned conjugate gradient method; finite element; ill-conditioned problems.
19. Full waveform inversion with spectral conjugate gradient method
Authors: Liu Xiao, Liu Mingchen, Sun Hui, Wang Qianlong. Global Geology, 2017, No. 1, pp. 40-45 (6 pages).
The spectral conjugate gradient method is an algorithm obtained by combining the spectral gradient method and the conjugate gradient method; it is characterized by the global convergence and simplicity of the spectral gradient method and the small storage requirement of the conjugate gradient method. Moreover, the search direction of the spectral conjugate gradient method has been proved to be a descent direction of the objective function at each iteration, even without relying on any line search. The spectral conjugate gradient method is applied to full waveform inversion in numerical tests on the Marmousi model. The authors compare numerical results obtained by the steepest descent method, the conjugate gradient method, and the spectral conjugate gradient method, which shows that the spectral conjugate gradient method is superior to the other two methods.
Keywords: full waveform inversion; spectral conjugate gradient method; conjugate gradient method; steepest descent method.
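A typical spectral conjugate gradient direction, stated here only for orientation (the particular scaling used in this paper's inversion tests may differ), combines a spectral scaling of the gradient with the usual CG recurrence:
\[
d_{k+1} = -\theta_{k+1}\,g_{k+1} + \beta_k d_k, \qquad
\theta_{k+1} = \frac{s_k^{\mathrm T} s_k}{s_k^{\mathrm T} y_k},
\]
with s_k = x_{k+1}-x_k and y_k = g_{k+1}-g_k, where \theta_{k+1} is a Barzilai-Borwein-type spectral parameter.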
20. An Efficient Projected Gradient Method for Convex Constrained Monotone Equations with Applications in Compressive Sensing (Cited: 1)
Authors: Yaping Hu, Yujie Wang. Journal of Applied Mathematics and Physics, 2020, No. 6, pp. 983-998 (16 pages).
In this paper, a modified Polak-Ribière-Polyak conjugate gradient projection method is proposed for solving large-scale nonlinear convex constrained monotone equations, based on the projection method of Solodov and Svaiter. The obtained method has a low-complexity property and converges globally. Furthermore, the method is extended to solve the sparse signal reconstruction problem in compressive sensing. Numerical experiments illustrate the efficiency of the given method and show that such a nonmonotone method is suitable for some large-scale problems.
Keywords: projection method; monotone equations; conjugate gradient method; compressive sensing.
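As a point of reference (our summary, not the paper's exact statement), the Solodov-Svaiter projection step for F monotone on a closed convex set C proceeds from a trial point z_k = x_k + \alpha_k d_k with F(z_k)^{\mathrm T}(x_k - z_k) > 0 and sets
\[
x_{k+1} = P_C\!\left[x_k - \frac{F(z_k)^{\mathrm T}(x_k - z_k)}{\lVert F(z_k)\rVert^{2}}\,F(z_k)\right],
\]
i.e. x_k is first projected onto the hyperplane through z_k with normal F(z_k), and the result is then projected onto C.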