Based on the analysis of the conjugate gradient algorithm, we implement a three-dimensional (3D) conjugate gradient inversion algorithm for magnetotelluric impedance data. During the inversion, the 3D conjugate gradient inversion algorithm does not need to compute and store the Jacobian matrix; instead, it updates the model directly from products of the Jacobian matrix with vectors. Requiring only one forward and four pseudo-forward modeling applications per frequency to produce the model update at each iteration, the algorithm greatly reduces the computational cost of the inversion. A trial inversion with synthetic magnetotelluric data verifies the validity and stability of the 3D conjugate gradient inversion algorithm.
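As a rough illustration of how such a Jacobian-free model update can be organized, the sketch below solves the Gauss-Newton normal equations by conjugate gradients using only operator applications. The callbacks `jvp` and `jtvp` (standing in for the forward and pseudo-forward modeling operators), the damping `lam`, and all array shapes are assumptions for illustration, not details of the paper's algorithm.

```python
import numpy as np

def cg_model_update(jvp, jtvp, residual, n_model, lam=1e-2, n_iter=50, tol=1e-8):
    """Solve (J^T J + lam I) dm = J^T r by conjugate gradients, touching the
    Jacobian J only through jvp(v) = J v and jtvp(v) = J^T v, so J itself is
    never formed or stored."""
    b = jtvp(residual)                  # right-hand side J^T r
    dm = np.zeros(n_model)
    r = b.copy()                        # residual of the normal equations
    p = r.copy()
    rs_old = r @ r
    for _ in range(n_iter):
        Ap = jtvp(jvp(p)) + lam * p     # (J^T J + lam I) p via two operator calls
        alpha = rs_old / (p @ Ap)
        dm += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return dm
```

Each application of `jvp` or `jtvp` corresponds to one (pseudo-)forward modeling run, which is where the per-frequency modeling counts quoted in the abstract come from.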
In seismic data processing, blind deconvolution is a key technology. This paper introduces a workflow for one kind of blind deconvolution. The optimal preconditioned conjugate gradient (PCG) method in the Krylov subspace is also used to improve the stability of the algorithm, and the computational cost is greatly decreased.
Although conventional reverse time migration can be applied well to structural imaging, it lacks the capability of delineating a lithological reservoir in detail because of irregular illumination. To obtain reliable reflectivity of the subsurface, it is necessary to solve the imaging problem using inversion. Least-squares reverse time migration (LSRTM), also known as linearized reflectivity inversion, aims to obtain relatively high-resolution, amplitude-preserving images by including the inverse of the Hessian matrix. In practice, the conjugate gradient algorithm proves to be an efficient iterative method for implementing LSRTM. The velocity gradient can be derived from a cross-correlation between the observed and simulated data, making LSRTM independent of the wavelet signature and thus more robust in practice. Tests on synthetic and marine data show that LSRTM has good potential for reservoir description and four-dimensional (4D) seismic imaging compared with traditional RTM and Fourier finite-difference (FFD) migration. This paper investigates the first-order approximation of LSRTM, which is also known as the linear Born approximation. However, for more complex geological structures a higher-order approximation should be considered to improve imaging quality.
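For reference, the linearized inversion behind LSRTM can be written as a least-squares problem; the notation below (L for the Born/demigration operator, m for the reflectivity model, d for the observed data) is generic shorthand rather than the paper's own.

```latex
\[
\min_{m}\; J(m) \;=\; \tfrac{1}{2}\,\lVert Lm - d\rVert_2^{2},
\qquad
\nabla J(m) \;=\; L^{\top}\!\left(Lm - d\right),
\qquad
\nabla^{2} J \;=\; L^{\top} L .
\]
```

Conventional RTM applies only the adjoint (migration) operator L^T to the data, whereas LSRTM iteratively accounts for the Hessian L^T L, which is what the conjugate gradient iterations approximate.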
Fast solution of large-scale linear equations in finite element analysis is a classical subject in computational mechanics and a key technique in computer-aided engineering (CAE) and computer-aided manufacturing (CAM). This paper presents a high-efficiency improved symmetric successive over-relaxation (ISSOR) preconditioned conjugate gradient (PCG) method, which maintains the convergence and inherent parallelism of the original form. Ideally, the computation can be reduced by nearly 50% compared with the original algorithm. The method is suitable for high-performance computing with its inherent basic high-efficiency operations. Comparison of the numerical results shows that the proposed method has the best performance.
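A minimal serial sketch of a PCG loop with an SSOR-style preconditioner is given below, only to show where the preconditioner enters the iteration; it does not reproduce the paper's improved ISSOR variant, and the function name `ssor_pcg` and the relaxation factor `omega` are assumptions for illustration.

```python
import numpy as np

def ssor_pcg(A, b, omega=1.2, max_iter=200, tol=1e-10):
    """CG preconditioned by the SSOR splitting of an SPD matrix A, using
    M = c * (D/omega + L) D^{-1} (D/omega + U) with c = omega / (2 - omega)."""
    d = np.diag(A)
    L = np.tril(A, -1)
    U = np.triu(A, 1)
    lower = L + np.diag(d / omega)
    upper = U + np.diag(d / omega)
    c = omega / (2.0 - omega)

    def solve_M(r):
        # z = M^{-1} r via one forward and one backward triangular solve
        y = np.linalg.solve(lower, r)
        return np.linalg.solve(upper, d * y) / c

    x = np.zeros_like(b)
    r = b - A @ x
    z = solve_M(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = solve_M(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

In a production code the dense triangular solves would be replaced by sparse sweeps, which is where the parallelism discussed in the abstract becomes the central issue.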
The electronic structure of GaAs/Al_xGa_(1-x)As superlattices has been investigated by an ab initio calculation method, the conjugate gradient (CG) approach. For this purpose, a conventional CG scheme is modified for our superlattices. First, departing from the former scheme, the eigenvalues and eigenfunctions are calculated for a fixed electron density n(z), and the new n(z) is then reconstructed from them. Also, for every k_z, the CG scheme is applied independently. The calculated energy difference between the two minibands and the Fermi energy are in good agreement with the experimental data.
Based on the analysis of impedance tensor data, tipper data, and the conjugate gradient algorithm, we develop a three-dimensional (3D) conjugate gradient algorithm for inverting magnetotelluric full-information data determined from the five electric and magnetic field components, and we discuss how to use the full-information data for quantitative interpretation of 3D inversion results. Results from the 3D inversion of synthetic data indicate that inverting the full-information data, which combine the impedance tensor and tipper data, gives better resolution and reliability than inverting the impedance tensor data (or the tipper data) alone. The synthetic examples also demonstrate the validity and stability of this 3D inversion algorithm.
With the continuous development of full tensor gradiometer (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is becoming increasingly used in oil and gas exploration. For fast processing and interpretation of large-scale, high-precision data, the graphics processing unit (GPU) and preconditioning methods are very important in the data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique with the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a parallel implementation based on the GPU is proposed. The improved method is then applied to the inversion of noise-contaminated synthetic data to prove its adaptability to the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm based on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times over a serial program using a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method for fast inversion of 3D FTG data.
Large-scale multi-objective optimization problems (LSMOPs) pose challenges to existing optimizers, since a set of well-converged and diverse solutions must be found in huge search spaces. While evolutionary algorithms are good at solving small-scale multi-objective optimization problems, they are criticized for low efficiency in converging to the optima of LSMOPs. By contrast, mathematical programming methods offer fast convergence on large-scale single-objective optimization problems, but they have difficulty finding diverse solutions for LSMOPs. Currently, how to integrate evolutionary algorithms with mathematical programming methods to solve LSMOPs remains unexplored. In this paper, a hybrid algorithm is tailored for LSMOPs by coupling differential evolution and a conjugate gradient method. On the one hand, conjugate gradients and differential evolution are used to update different decision variables of a set of solutions, where the former drives the solutions to converge quickly towards the Pareto front and the latter promotes the diversity of the solutions to cover the whole Pareto front. On the other hand, the objective decomposition strategy of evolutionary multi-objective optimization is used to differentiate the conjugate gradients of the solutions, and the line search strategy of mathematical programming is used to ensure that each offspring is of higher quality than its parent. In comparison with state-of-the-art evolutionary algorithms, mathematical programming methods, and hybrid algorithms, the proposed algorithm exhibits better convergence and diversity on a variety of benchmark and real-world LSMOPs.
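A loose conceptual sketch of the "split the decision variables between a gradient-driven update and a DE update" idea is given below. It is not the paper's algorithm: the function `hybrid_generation`, the callback `grad_fn`, the parameters `step`, `F`, `CR`, and `cg_fraction` are all assumptions, and a plain steepest-descent step stands in for the conjugate gradient update.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_generation(pop, grad_fn, step=0.1, F=0.5, CR=0.9, cg_fraction=0.5):
    """One simplified generation: the first block of decision variables of each
    solution is pushed along a negative-gradient direction (convergence),
    the remaining block is varied with DE/rand/1 + binomial crossover (diversity)."""
    n_pop, n_var = pop.shape
    k = int(cg_fraction * n_var)          # variables handled by the gradient step
    offspring = pop.copy()
    for i in range(n_pop):
        # gradient-driven block
        g = grad_fn(pop[i])
        offspring[i, :k] = pop[i, :k] - step * g[:k]
        # DE-driven block
        a, b, c = pop[rng.choice(n_pop, 3, replace=False)]
        mutant = a + F * (b - c)
        cross = rng.random(n_var) < CR
        offspring[i, k:] = np.where(cross[k:], mutant[k:], pop[i, k:])
    return offspring
```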
A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak and Ribière and by Polyak is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independent of the line search and of the convexity of the objective function. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
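As background, and not as the paper's exact update, the Polak-Ribière(-Polyak) parameter and the generic shape of a three-term search direction are

```latex
\[
\beta_k^{\mathrm{PRP}} \;=\; \frac{g_k^{\top}\,(g_k - g_{k-1})}{\lVert g_{k-1}\rVert^{2}},
\qquad
d_k \;=\; -\,g_k \;+\; \beta_k\, d_{k-1} \;+\; \theta_k\,(g_k - g_{k-1}),
\qquad d_0 = -g_0 ,
\]
```

where the particular choices of the coefficients beta_k and theta_k (left unspecified here) are exactly what the hybridization and the eigenvalue analysis in the paper concern.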
In this paper, an efficient conjugate gradient method is given for solving general unconstrained optimization problems, which can guarantee the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable in comparison with the PRP+ method, so it can be widely used in scientific computation.
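The two conditions named in the abstract are standard; in the usual form, with g_k the gradient at x_k, d_k the search direction, and constants c > 0 and 0 < c_1 < c_2 < 1, they read

```latex
\[
g_k^{\top} d_k \;\le\; -\,c\,\lVert g_k\rVert^{2}
\quad\text{(sufficient descent)},
\]
\[
f(x_k + \alpha_k d_k) \;\le\; f(x_k) + c_1\,\alpha_k\, g_k^{\top} d_k,
\qquad
\bigl|\nabla f(x_k + \alpha_k d_k)^{\top} d_k\bigr| \;\le\; c_2\,\bigl|g_k^{\top} d_k\bigr|
\quad\text{(strong Wolfe)}.
\]
```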
Conjugate gradient optimization algorithms depend on the search directions, with different choices for the parameters in the search directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method with two kinds of common line searches is proved. First, it is shown, using a reverse modulus of continuity function and a forcing function, that the new method for solving unconstrained optimization can work for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, by using a comparison technique, some general convergence properties of the new method with another kind of step size rule are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
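A compact sketch of a restarting nonlinear CG loop is given below to fix ideas. It uses a plain Armijo backtracking line search and a PR-type parameter with a simple restart rule, which are generic stand-ins for the step size rules and restart criteria analyzed in the note; the function names and the Rosenbrock example are assumptions for illustration.

```python
import numpy as np

def backtracking(f, x, d, g, alpha=1.0, rho=0.5, c1=1e-4):
    """Armijo backtracking line search (generic stand-in for the paper's rules)."""
    while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
        alpha *= rho
    return alpha

def restarting_cg(f, grad, x0, max_iter=500, tol=1e-6):
    """Nonlinear CG with a PR-type parameter, restarting to steepest descent
    whenever beta is non-positive or the new direction fails to descend."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (g @ g)      # Polak-Ribiere
        d = -g_new + max(beta, 0.0) * d           # restart when beta <= 0
        if g_new @ d >= 0:                        # safeguard: keep a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Illustrative use on the Rosenbrock function
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(restarting_cg(f, grad, [-1.2, 1.0]))
```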
In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the FR method, a class of new restarting three-term conjugate gradient methods is presented. Global convergence properties of the new method with two kinds of common line searches are proved.
Many methods have been put forward to solve unconstrained optimization problems, among which the conjugate gradient (CG) method is very important. With the increasing emergence of large-scale problems, subspace technology has become particularly important and widely used in the field of optimization. In this study, a new CG method is put forward, which combines subspace technology with a cubic regularization model. Besides, a special scaled norm in the cubic regularization model is analyzed. Under certain conditions, some significant characteristics of the search direction are given and the convergence of the algorithm is established. Numerical comparisons show that, for the 145 test functions from the CUTEr library, the proposed method is better than two classical CG methods and two new subspace conjugate gradient methods.
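The cubic regularization subproblem referred to here has the standard form below, with g_k the gradient, B_k a Hessian approximation, and sigma_k > 0 the regularization weight; the paper replaces the Euclidean norm by a special scaled norm and restricts d to a subspace, details that are omitted in this generic statement.

```latex
\[
\min_{d}\; m_k(d) \;=\; f(x_k) \;+\; g_k^{\top} d \;+\; \tfrac{1}{2}\, d^{\top} B_k\, d \;+\; \tfrac{\sigma_k}{3}\,\lVert d\rVert^{3}.
\]
```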
In [3], Liu et al. investigated the global convergence of conjugate gradient methods. In that paper they allowed β_k to be selected in a wider range, and the global convergence of the corresponding algorithm without the sufficient decrease condition was proved. This paper investigates the global convergence of a nonmonotone conjugate gradient method under the same conditions.
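Nonmonotone schemes typically relax the usual decrease requirement by comparing against the largest of the last few function values; a common form of the acceptance condition, with memory length M and constant δ in (0, 1), is given below. The abstract does not specify the exact rule used in the paper, so this is only the standard formulation.

```latex
\[
f(x_k + \alpha_k d_k) \;\le\; \max_{0 \le j \le \min(k,\,M)} f(x_{k-j}) \;+\; \delta\,\alpha_k\, g_k^{\top} d_k .
\]
```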
In this paper, a class of stochastic generalized linear complementarity problems with finitely many elements is proposed for the first time. Based on the Fischer-Burmeister function, a new conjugate gradient projection method is given for solving the stochastic generalized linear complementarity problems. The global convergence of the conjugate gradient projection method is proved and the related numerical results are reported.
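The Fischer-Burmeister function mentioned here is the standard NCP function

```latex
\[
\varphi(a,b) \;=\; \sqrt{a^{2} + b^{2}} \;-\; a \;-\; b,
\qquad
\varphi(a,b) = 0 \;\Longleftrightarrow\; a \ge 0,\; b \ge 0,\; ab = 0 ,
\]
```

which recasts the complementarity conditions as a system of equations (or a least-squares merit function) that gradient-type projection methods can then attack.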
Fast convergence without dependence on the initial value is the key to solving large-angle relative orientation. Therefore, a hybrid conjugate gradient algorithm is proposed in this paper. The concrete process is: (1) the stochastic hill climbing (SHC) algorithm applies a random disturbance to the given initial value of the relative orientation elements, and a new value that guarantees the optimization direction is generated; (2) in the local optimization, a superlinearly convergent conjugate gradient method is used to replace the steepest descent method in relative orientation to improve its convergence rate; (3) the global convergence condition is that the calculation error is less than the prescribed limit error. A comparison experiment shows that the method proposed in this paper is independent of the initial value, and has higher accuracy and fewer iterations.
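A schematic of the two-stage idea, SHC-style random perturbation of the initial value followed by CG-based local refinement, might look like the following. The cost callback `reprojection_error`, the perturbation scale `sigma`, the restart count, and the use of SciPy's generic CG minimizer are all assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

def hybrid_relative_orientation(reprojection_error, x0, n_restarts=20, sigma=0.1, tol=1e-8):
    """Stage 1: stochastic hill climbing around the given initial orientation elements.
    Stage 2: conjugate gradient local optimization from the best perturbed start."""
    x0 = np.asarray(x0, dtype=float)
    best_x, best_f = x0, reprojection_error(x0)
    # Stage 1: random disturbances, keeping only improving moves (hill climbing)
    for _ in range(n_restarts):
        cand = best_x + sigma * rng.standard_normal(x0.shape)
        f_cand = reprojection_error(cand)
        if f_cand < best_f:
            best_x, best_f = cand, f_cand
    # Stage 2: superlinear-style local refinement with a CG minimizer
    result = minimize(reprojection_error, best_x, method="CG", tol=tol)
    return result.x, result.fun
```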
The preconditioned conjugate gradient deconvolution method combines the realization of sparse deconvolution with the optimal preconditioned conjugate gradient method to invert for reflection coefficients. This method can enhance the frequency content of seismic data and widen the effective frequency bandwidth. Considering the time-varying nature of seismic signals, we replace the constant wavelet with a multi-scale, time-varying wavelet during deconvolution. Numerical tests show that this method gives good results in application.
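In generic notation, sparse deconvolution poses the reflectivity estimation as a regularized least-squares problem. Below, W is the convolution matrix built from the (possibly time-varying) wavelet, r the reflectivity series, d the seismic trace, and the l1 penalty is one common sparsity choice rather than necessarily the paper's exact formulation.

```latex
\[
d \;=\; W r + n,
\qquad
\min_{r}\; \tfrac{1}{2}\,\lVert W r - d\rVert_2^{2} \;+\; \lambda\,\lVert r\rVert_1 ,
\]
```

with the linear systems arising at each iteration solved by the preconditioned conjugate gradient method.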
Two new formulas for the main parameter β_k of the conjugate gradient method are presented, which can be seen as modifications of the HS and PRP methods, respectively. In comparison with classic conjugate gradient methods, the new methods use both the available gradient and function value information. Furthermore, their modifications are proposed. These methods are shown to be globally convergent under some assumptions. Numerical results are also reported.
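For reference, the two classical parameters being modified are (with g_k the current gradient, y_{k-1} = g_k - g_{k-1}, and d_{k-1} the previous search direction)

```latex
\[
\beta_k^{\mathrm{HS}} \;=\; \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\qquad
\beta_k^{\mathrm{PRP}} \;=\; \frac{g_k^{\top} y_{k-1}}{\lVert g_{k-1}\rVert^{2}} ;
\]
```

the new formulas additionally bring in function value information, which these classical expressions ignore.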
We extend a result presented by Y.F. Hu and C. Storey (1991) [1] on the global convergence of conjugate gradient methods with different choices for the parameter β_k. In this note, the conditions given on β_k are milder than those used by Y.F. Hu and C. Storey.
It is well known that Newton and quasi-Newton algorithms are effective for small- and medium-scale smooth problems because they make full use of the gradient information, but they fail to solve nonsmooth problems. The algorithm stemming from the concept of a 'bundle' successfully addresses both smooth and nonsmooth complex problems, but it is regrettable that it is only effective for small and medium optimization models, since it needs to store and update the relevant information of the parameter bundle. The conjugate gradient algorithm is effective for both large-scale smooth and nonsmooth optimization models because of its simplicity: it uses the objective function's information together with the technique of Moreau-Yosida regularization. Thus, a modified three-term conjugate gradient algorithm is proposed; it has a sufficient descent property and a trust-region character. At the same time, it possesses global convergence under mild assumptions, and numerical tests show that it is more efficient than similar optimization algorithms.
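The Moreau-Yosida regularization that makes a nonsmooth convex objective f tractable for gradient-type methods is, in standard form (lambda > 0),

```latex
\[
F_{\lambda}(x) \;=\; \min_{y}\;\Bigl\{\, f(y) + \tfrac{1}{2\lambda}\,\lVert y - x\rVert^{2} \Bigr\},
\qquad
\nabla F_{\lambda}(x) \;=\; \frac{x - p_{\lambda}(x)}{\lambda},
\]
```

where p_λ(x) is the unique minimizer (the proximal point). F_λ is continuously differentiable even when f is merely convex and nonsmooth, which is what allows a conjugate gradient iteration to be applied to it.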