Journal Articles
A total of 144 articles found.
Three-dimensional conjugate gradient inversion of magnetotelluric sounding data (Cited by: 4)
1
Authors: 林昌洪, 谭捍东, 佟拓. Applied Geophysics (SCIE, CSCD), 2008, Issue 4, pp. 314-321 (8 pages)
Based on the analysis of the conjugate gradient algorithm, we implement a three-dimensional (3D) conjugate gradient inversion algorithm with magnetotelluric impedance data. During the inversion process, the 3D conjugate gradient inversion algorithm does not need to compute and store the Jacobian matrix, but directly updates the model from products of the Jacobian matrix with vectors. Requiring only one forward and four pseudo-forward modeling applications per frequency to produce the model update at each iteration, this algorithm efficiently reduces the computation of the inversion. A trial inversion with synthetic magnetotelluric data verifies the validity and stability of the 3D conjugate gradient inversion algorithm.
Keywords: magnetotelluric, 3D inversion, conjugate gradient
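As a generic illustration of the Jacobian-free idea described in the abstract above (a sketch, not the authors' nonlinear conjugate gradient scheme for magnetotelluric data), the following fragment solves a regularized Gauss-Newton model update with conjugate gradients while accessing the sensitivity matrix only through forward-like (`jvp`) and adjoint/pseudo-forward-like (`jtvp`) callbacks; the callback names and the simple Tikhonov term are assumptions made for the example.

```python
import numpy as np

def cg_model_update(jvp, jtvp, residual, m0, reg=1e-3, n_iter=20):
    """Matrix-free CG for the update (J^T J + reg*I) dm = J^T residual.
    J is never formed or stored: jvp(v) stands for J @ v (forward-like call)
    and jtvp(v) for J.T @ v (adjoint / pseudo-forward-like call)."""
    dm = np.zeros_like(m0)
    r = jtvp(residual)                 # normal-equation residual at dm = 0
    p = r.copy()
    rr = r @ r
    for _ in range(n_iter):
        Ap = jtvp(jvp(p)) + reg * p    # one forward + one adjoint apply per iteration
        alpha = rr / (p @ Ap)
        dm += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < 1e-12:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return dm

# Toy usage: an explicit matrix stands in for the modelling operators.
rng = np.random.default_rng(0)
J = rng.standard_normal((30, 10))
data_residual = rng.standard_normal(30)
dm = cg_model_update(lambda v: J @ v, lambda v: J.T @ v, data_residual, np.zeros(10))
```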
Blind Deconvolution Method Based on Precondition Conjugate Gradients (Cited by: 1)
2
Authors: 朱振宇, 裴江云, 吕小林, 刘洪, 李幼铭. Petroleum Science (SCIE, CAS, CSCD), 2004, Issue 3, pp. 37-40 (4 pages)
In seismic data processing, blind deconvolution is a key technology. Introduced in this paper is a workflow for one kind of blind deconvolution. The optimal preconditioned conjugate gradient (PCG) method in the Krylov subspace is also used to improve the stability of the algorithm. The amount of computation is greatly decreased.
Keywords: blind deconvolution, preconditioned conjugate gradients (PCG), reflectivity series
Conjugate gradient and cross-correlation based least-square reverse time migration and its application (Cited by: 1)
3
Authors: 孙小东, 李振春, 葛中慧. Applied Geophysics (SCIE, CSCD), 2017, Issue 3, pp. 381-386, 460 (7 pages)
Although conventional reverse time migration can be perfectly applied to structural imaging, it lacks the capability of enabling detailed delineation of a lithological reservoir due to irregular illumination. To obtain reliable reflectivity of the subsurface, it is necessary to solve the imaging problem using inversion. Least-squares reverse time migration (LSRTM), also known as linearized reflectivity inversion, aims to obtain relatively high-resolution, amplitude-preserving imaging by including the inverse of the Hessian matrix. In practice, the conjugate gradient algorithm has proven to be an efficient iterative method for LSRTM. The velocity gradient can be derived from a cross-correlation between observed data and simulated data, making LSRTM independent of the wavelet signature and thus more robust in practice. Tests on synthetic and marine data show that LSRTM has good potential for use in reservoir description and four-dimensional (4D) seismic imaging compared with traditional RTM and Fourier finite-difference (FFD) migration. This paper investigates the first-order approximation of LSRTM, which is also known as the linear Born approximation. For more complex geological structures, however, a higher-order approximation should be considered to improve imaging quality.
Keywords: reverse time migration, reflectivity, Hessian matrix, conjugate gradient
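For context on the entry above, the standard LSRTM formulation (textbook background, not taken from the paper) poses imaging as a linear least-squares problem in the reflectivity m, with L the linearized Born modelling operator and d the observed data:

```latex
\min_{m}\ \tfrac{1}{2}\,\lVert Lm-d\rVert_2^2,\qquad
\nabla_m = L^{\mathsf T}(Lm-d),\qquad
(L^{\mathsf T}L)\,m = L^{\mathsf T}d .
```

The conventional RTM image corresponds to the adjoint L^T d, while LSRTM iteratively approximates (L^T L)^{-1} L^T d, which is why including (an approximation of) the inverse Hessian sharpens amplitudes; conjugate gradients solve the normal equations without ever forming or inverting the Hessian explicitly.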
High-efficiency improved symmetric successive over-relaxation preconditioned conjugate gradient method for solving large-scale finite element linear equations (Cited by: 1)
4
Authors: 李根, 唐春安, 李连崇. Applied Mathematics and Mechanics (English Edition) (SCIE, EI), 2013, Issue 10, pp. 1225-1236 (12 pages)
Fast solution of large-scale linear equations in finite element analysis is a classical subject in computational mechanics. It is a key technique in computer aided engineering (CAE) and computer aided manufacturing (CAM). This paper presents a high-efficiency improved symmetric successive over-relaxation (ISSOR) preconditioned conjugate gradient (PCG) method, which maintains convergence and inherent parallelism consistent with the original form. Ideally, the computation can be reduced by nearly 50% compared with the original algorithm. It is suitable for high-performance computing with its inherent basic high-efficiency operations. Comparison with numerical results shows that the proposed method has the best performance.
Keywords: improved preconditioned conjugate gradient (PCG) method, conjugate gradient method, large-scale linear equation, finite element method
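A minimal sketch of a textbook SSOR-preconditioned conjugate gradient solver is given below for orientation; it is not the authors' improved ISSOR variant, and the dense NumPy triangular solves are only to keep the example short (a finite element code would use sparse forward/backward sweeps).

```python
import numpy as np

def ssor_apply_inverse(A, r, omega=1.2):
    """Return z = M^{-1} r for the SSOR preconditioner of a symmetric A = D + L + L^T,
    where M = (D + omega*L) D^{-1} (D + omega*L)^T / (omega*(2 - omega))."""
    D = np.diag(np.diag(A))
    L = np.tril(A, k=-1)
    u = np.linalg.solve(D + omega * L, omega * (2.0 - omega) * r)  # forward sweep
    z = np.linalg.solve((D + omega * L).T, D @ u)                  # backward sweep
    return z

def ssor_pcg(A, b, omega=1.2, tol=1e-8, max_iter=500):
    """Preconditioned conjugate gradient with the SSOR preconditioner above."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = ssor_apply_inverse(A, r, omega)
    p = z.copy()
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = ssor_apply_inverse(A, r, omega)
        rz_new = r @ z
        beta = rz_new / rz
        p = z + beta * p
        rz = rz_new
    return x, k

# Toy usage on a small symmetric positive definite (1D Laplacian) system.
n = 50
A = np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1) + np.diag(-np.ones(n - 1), -1)
x, iters = ssor_pcg(A, np.ones(n))
```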
ab initio CALCULATION FOR THE ELECTRONIC STRUCTURE OF GaAs/Al_xGa_(1-x)As SUPERLATTICES: CONJUGATE GRADIENT APPROACH
5
Authors: 金英进, 姜恩永, 金光日, 金成规, 任世伟. Transactions of Tianjin University (EI, CAS), 2001, Issue 2, pp. 98-100 (3 pages)
The electronic structure of GaAs/Al_xGa_(1-x)As superlattices has been investigated by an ab initio calculation method, the conjugate gradient (CG) approach. To this end, a conventional CG scheme is modified for our superlattices: first, departing from the former scheme, the eigenvalues and eigenfunctions are calculated for the fixed electron density n(z), and these are then used to reconstruct the new n(z). In addition, the CG scheme is applied independently for every k_z. The calculated energy difference between two minibands and the Fermi energy are in good agreement with the experimental data.
Keywords: electronic structure, superlattice, ab initio calculation, conjugate gradient approach
Three-dimensional conjugate gradient inversion of magnetotelluric full information data (Cited by: 9)
6
Authors: Lin Chang-Hong, Tan Han-Dong, Tong Tuo. Applied Geophysics (SCIE, CSCD), 2011, Issue 1, pp. 1-10, 94 (11 pages)
Based on the analysis of impedance tensor data, tipper data, and the conjugate gradient algorithm, we develop a three-dimensional (3D) conjugate gradient algorithm for inverting magnetotelluric full information data determined from five electric and magnetic field components, and discuss how to use the full information data for quantitative interpretation of 3D inversion results. Results from the 3D inversion of synthetic data indicate that inverting full information data, which combine the impedance tensor and tipper data, gives better resolution and reliability than inverting only the impedance tensor data (or tipper data). The synthetic examples also demonstrate the validity and stability of this 3D inversion algorithm.
Keywords: magnetotelluric, full information data, 3D inversion, conjugate gradient
Improved preconditioned conjugate gradient algorithm and application in 3D inversion of gravity-gradiometry data (Cited by: 9)
7
Authors: Wang Tai-Han, Huang Da-Nian, Ma Guo-Qing, Meng Zhao-Hai, Li Ye. Applied Geophysics (SCIE, CSCD), 2017, Issue 2, pp. 301-313, 324 (14 pages)
With the continuous development of full tensor gradiometer (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is becoming increasingly used in oil and gas exploration. For the fast processing and interpretation of large-scale high-precision data, the use of the graphics processing unit (GPU) and preconditioning methods is very important in the data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique and the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a parallel implementation based on the GPU is proposed. The improved method is then applied in the inversion of noise-contaminated synthetic data to prove its adaptability in the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm based on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times that of a serial program using a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method in the fast inversion of 3D FTG data.
Keywords: full tensor gravity gradiometry (FTG), ICCG method, conjugate gradient algorithm, gravity-gradiometry data inversion, CPU and GPU
Integrating Conjugate Gradients Into Evolutionary Algorithms for Large-Scale Continuous Multi-Objective Optimization (Cited by: 4)
8
Authors: Ye Tian, Haowen Chen, Haiping Ma, Xingyi Zhang, Kay Chen Tan, Yaochu Jin. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2022, Issue 10, pp. 1801-1817 (17 pages)
Large-scale multi-objective optimization problems (LSMOPs) pose challenges to existing optimizers since a set of well-converged and diverse solutions should be found in huge search spaces. While evolutionary algorithms are good at solving small-scale multi-objective optimization problems, they are criticized for low efficiency in converging to the optima of LSMOPs. By contrast, mathematical programming methods offer fast convergence on large-scale single-objective optimization problems, but they have difficulties in finding diverse solutions for LSMOPs. Currently, how to integrate evolutionary algorithms with mathematical programming methods to solve LSMOPs remains unexplored. In this paper, a hybrid algorithm is tailored for LSMOPs by coupling differential evolution and a conjugate gradient method. On the one hand, conjugate gradients and differential evolution are used to update different decision variables of a set of solutions, where the former drives the solutions to quickly converge towards the Pareto front and the latter promotes the diversity of the solutions to cover the whole Pareto front. On the other hand, the objective decomposition strategy of evolutionary multi-objective optimization is used to differentiate the conjugate gradients of solutions, and the line search strategy of mathematical programming is used to ensure that each offspring is of higher quality than its parent. In comparison with state-of-the-art evolutionary algorithms, mathematical programming methods, and hybrid algorithms, the proposed algorithm exhibits better convergence and diversity performance on a variety of benchmark and real-world LSMOPs.
Keywords: conjugate gradient, differential evolution, evolutionary computation, large-scale multi-objective optimization, mathematical programming
A modified three-term conjugate gradient method with sufficient descent property (Cited by: 1)
9
Authors: Saman Babaie-Kafaki. Applied Mathematics (A Journal of Chinese Universities) (SCIE, CSCD), 2015, Issue 3, pp. 263-272 (10 pages)
A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak and Ribière, and by Polyak, is suggested. Based on an eigenvalue analysis, it is shown that the search directions of the proposed method satisfy the sufficient descent condition, independent of the line search and of the convexity of the objective function. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
Keywords: unconstrained optimization, conjugate gradient method, eigenvalue, sufficient descent condition, global convergence
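For readers unfamiliar with the three-term family referenced above, a commonly cited three-term PRP direction of the Zhang et al. type is shown here as background (the hybrid direction of the paper itself is not reproduced):

```latex
d_0=-g_0,\qquad
d_{k+1}=-g_{k+1}+\beta_k^{\mathrm{PRP}} d_k-\theta_k y_k,\qquad
\beta_k^{\mathrm{PRP}}=\frac{g_{k+1}^{\mathsf T}y_k}{\lVert g_k\rVert^2},\quad
\theta_k=\frac{g_{k+1}^{\mathsf T}d_k}{\lVert g_k\rVert^2},\quad
y_k=g_{k+1}-g_k,
```

which gives d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 identically, that is, sufficient descent independent of the line search, the property the abstract refers to.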
A New Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems (Cited by: 1)
10
Authors: LIU Jin-kui, WANG Kai-rong, SONG Xiao-qian, DU Xiang-lin. Chinese Quarterly Journal of Mathematics (CSCD), 2010, Issue 3, pp. 444-450 (7 pages)
In this paper, an efficient conjugate gradient method is given to solve general unconstrained optimization problems, which can guarantee the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable in comparison with the PRP+ method, so it can be widely used in scientific computation.
Keywords: unconstrained optimization, conjugate gradient method, strong Wolfe line search, sufficient descent property, global convergence
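To make the ingredients named in the abstract concrete, here is a generic nonlinear conjugate gradient loop with a strong Wolfe line search (SciPy's `line_search`) and a PRP+ update; it is a sketch of the classical PRP+ scheme that the paper compares against, not the paper's new formula.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with a strong Wolfe line search and the PRP+ beta update."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:            # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)   # PRP+ keeps beta >= 0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Toy usage on the Rosenbrock function shipped with SciPy.
x_min = nonlinear_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
```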
Global Convergence of a New Restarting Conjugate Gradient Method for Nonlinear Optimizations (Cited by: 1)
11
Authors: SUN Qing-ying (Department of Applied Mathematics, Dalian University of Technology, Dalian 116024, China; Department of Applied Mathematics, University of Petroleum, Dongying 257061, China). Chinese Quarterly Journal of Mathematics (CSCD), 2003, Issue 2, pp. 154-162 (9 pages)
Conjugate gradient optimization algorithms depend on search directions with different choices of the parameters in the search directions. In this note, by combining the nice numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method with two kinds of common line searches is proved. First, it is shown that, using a reverse modulus of continuity function and a forcing function, the new method for unconstrained optimization can work for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, by using a comparison technique, some general convergence properties of the new method with another kind of step size rule are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
Keywords: nonlinear programming, restarting conjugate gradient method, forcing function, reverse modulus of continuity function, convergence
Global Convergence of a New Restarting Three Terms Conjugate Gradient Method for Non-linear Optimizations (Cited by: 1)
12
Authors: SUN Qing-ying, SANG Zhao-yang, TIAN Feng-ting. Chinese Quarterly Journal of Mathematics (CSCD), 2011, Issue 1, pp. 69-76 (8 pages)
In this note, by combining the nice numerical performance of the PR and HS methods with the global convergence property of the FR method, a class of new restarting three-term conjugate gradient methods is presented. Global convergence properties of the new method with two kinds of common line searches are proved.
Keywords: nonlinear programming, restarting three-term conjugate gradient method, forcing function, reverse modulus of continuity function, convergence
Subspace Minimization Conjugate Gradient Method Based on Cubic Regularization Model for Unconstrained Optimization (Cited by: 1)
13
Authors: Ting Zhao, Hongwei Liu. Journal of Harbin Institute of Technology (New Series) (CAS), 2021, Issue 5, pp. 61-69 (9 pages)
Many methods have been put forward to solve unconstrained optimization problems, among which the conjugate gradient method (CG) is very important. With the increasing emergence of large-scale problems, subspace technology has become particularly important and widely used in the field of optimization. In this study, a new CG method is put forward, which combines subspace technology and a cubic regularization model. Besides, a special scaled norm in the cubic regularization model is analyzed. Under certain conditions, some significant characteristics of the search direction are given and the convergence of the algorithm is established. Numerical comparisons show that, for the 145 test functions in the CUTEr library, the proposed method is better than two classical CG methods and two new subspace conjugate gradient methods.
Keywords: cubic regularization model, conjugate gradient method, subspace technique, unconstrained optimization
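As background for the entry above, the standard (Euclidean-norm) cubic regularization model at an iterate x_k is shown below; the paper combines a subspace version of such a model with a special scaled norm, which is not reproduced here.

```latex
m_k(d)=f(x_k)+g_k^{\mathsf T}d+\tfrac{1}{2}\,d^{\mathsf T}B_k d+\tfrac{\sigma_k}{3}\,\lVert d\rVert^{3},
```

where g_k is the gradient, B_k a Hessian approximation, and σ_k > 0 an adaptively updated regularization parameter; the trial step minimizes m_k, and in subspace methods only over a low-dimensional subspace spanned by, for example, the current gradient and previous search directions.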
Convergence Analysis on a Class of Nonmonotone Conjugate Gradient Methods without Sufficient Decrease Condition (Cited by: 1)
14
Authors: DU Shou-qiang, CHEN Yuan-yuan. Chinese Quarterly Journal of Mathematics (CSCD), 2004, Issue 2, pp. 142-145 (4 pages)
In [3] Liu et al. investigated the global convergence of conjugate gradient methods. In that paper they allowed β_k to be selected in a wider range and proved the global convergence of the corresponding algorithm without the sufficient decrease condition. This paper investigates the global convergence of a nonmonotone conjugate gradient method under the same conditions.
Keywords: nonmonotone conjugate gradient, global convergence, nonmonotone line search
A New Conjugate Gradient Projection Method for Solving Stochastic Generalized Linear Complementarity Problems (Cited by: 2)
15
Authors: Zhimin Liu, Shouqiang Du, Ruiying Wang. Journal of Applied Mathematics and Physics, 2016, Issue 6, pp. 1024-1031 (8 pages)
In this paper, a class of stochastic generalized linear complementarity problems with finitely many elements is proposed for the first time. Based on the Fischer-Burmeister function, a new conjugate gradient projection method is given for solving the stochastic generalized linear complementarity problems. The global convergence of the conjugate gradient projection method is proved, and the related numerical results are also reported.
Keywords: stochastic generalized linear complementarity problems, Fischer-Burmeister function, conjugate gradient projection method, global convergence
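For reference, the Fischer-Burmeister function mentioned in the abstract is

```latex
\varphi_{\mathrm{FB}}(a,b)=\sqrt{a^{2}+b^{2}}-a-b,\qquad
\varphi_{\mathrm{FB}}(a,b)=0\ \Longleftrightarrow\ a\ge 0,\ b\ge 0,\ ab=0,
```

so complementarity conditions can be rewritten as a system of equations (or a merit-function minimization) to which gradient-type schemes such as a conjugate gradient projection method can be applied.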
A Hybrid Conjugate Gradient Algorithm for Solving Relative Orientation of Big Rotation Angle Stereo Pair (Cited by: 3)
16
Authors: Jiatian LI, Congcong WANG, Chenglin JIA, Yiru NIU, Yu WANG, Wenjing ZHANG, Huajing WU, Jian LI. Journal of Geodesy and Geoinformation Science, 2020, Issue 2, pp. 62-70 (9 pages)
Fast convergence without initial value dependence is the key to solving relative orientation at large rotation angles. Therefore, a hybrid conjugate gradient algorithm is proposed in this paper. The concrete process is: (1) the stochastic hill climbing (SHC) algorithm applies a random disturbance to the given initial values of the relative orientation elements, and a new value that guarantees the optimization direction is generated; (2) in local optimization, a super-linearly convergent conjugate gradient method is used to replace the steepest descent method in relative orientation to improve its convergence rate; (3) the global convergence condition is that the calculation error is less than the prescribed limit error. Comparison experiments show that the method proposed in this paper is independent of the initial value and has higher accuracy and fewer iterations.
Keywords: relative orientation, big rotation angle, global convergence, stochastic hill climbing, conjugate gradient algorithm
The preconditioned conjugate gradient deconvolution method and its application
17
Authors: Xi Xiaoyu, Liu Hong. Applied Geophysics (SCIE, CSCD), 2006, Issue 3, pp. 156-162 (7 pages)
The preconditioned conjugate gradient deconvolution method combines the realization of sparse deconvolution and the optimal preconditioned conjugate gradient method to invert for reflection coefficients. This method can enhance the frequency content in seismic data processing and widen the valid frequency bandwidth. Considering the time-varying nature of seismic signals, we replace the constant wavelet with a multi-scale time-varying wavelet during deconvolution. Numerical tests show that this method can obtain good application results.
Keywords: preconditioned conjugate gradient, deconvolution, multi-scale time-varying wavelet and high frequency restoration
New type of conjugate gradient algorithms for unconstrained optimization problems
18
Authors: Caiying Wu, Guoqing Chen. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2010, Issue 6, pp. 1000-1007 (8 pages)
Two new formulas for the main parameter β_k of the conjugate gradient method are presented, which can be seen as modifications of the HS and PRP methods, respectively. In comparison with classic conjugate gradient methods, the new methods use both the available gradient and function value information. Furthermore, their modifications are proposed. These methods are shown to be globally convergent under some assumptions. Numerical results are also reported.
Keywords: conjugate gradient, unconstrained optimization, global convergence, conjugacy condition
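For orientation, the classical HS and PRP parameters that the abstract says are being modified are (with y_k = g_{k+1} - g_k); the paper's new formulas, which additionally use function values, are not reproduced here:

```latex
\beta_k^{\mathrm{HS}}=\frac{g_{k+1}^{\mathsf T}y_k}{d_k^{\mathsf T}y_k},\qquad
\beta_k^{\mathrm{PRP}}=\frac{g_{k+1}^{\mathsf T}y_k}{\lVert g_k\rVert^{2}},\qquad
d_{k+1}=-g_{k+1}+\beta_k d_k .
```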
A Note on Global Convergence Result for Conjugate Gradient Methods
19
Authors: BAI Yan-qin (Department of Mathematics, College of Sciences, Shanghai University, Shanghai 200436, China). Journal of Shanghai University (English Edition) (CAS), 2001, Issue 1, pp. 15-19 (5 pages)
We extend a result presented by Y.F. Hu and C. Storey (1991) [1] on the global convergence of conjugate gradient methods with different choices for the parameter β_k. In this note, the conditions given on β_k are milder than those used by Y.F. Hu and C. Storey.
Keywords: conjugate gradient algorithm, descent property, global convergence, restarting
A Modified Three-Term Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization
20
Authors: Wujie Hu, Gonglin Yuan, Hongtruong Pham. Computers, Materials & Continua (SCIE, EI), 2020, Issue 2, pp. 787-800 (14 pages)
It is well known that Newton and quasi-Newton algorithms are effective for small and medium scale smooth problems because they make full use of the corresponding gradient information, but they fail to solve nonsmooth problems. The well-regarded algorithms stemming from the concept of a 'bundle' successfully address both smooth and nonsmooth complex problems, but regrettably they are only effective for small and medium optimization models, since they need to store and update the relevant bundle of parameter information. The conjugate gradient algorithm is effective for both large-scale smooth and nonsmooth optimization models owing to its simplicity: it utilizes the objective function's information together with the technique of Moreau-Yosida regularization. Thus, a modified three-term conjugate gradient algorithm is proposed; it has a sufficient descent property and a trust region character. At the same time, it possesses global convergence under mild assumptions, and numerical tests show that it is more efficient than similar optimization algorithms.
Keywords: conjugate gradient, large-scale, trust region, global convergence
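As background for the Moreau-Yosida regularization mentioned above (standard definition, not specific to the paper): for a convex, possibly nonsmooth f and λ > 0,

```latex
F_{\lambda}(x)=\min_{y}\Bigl\{f(y)+\tfrac{1}{2\lambda}\lVert y-x\rVert^{2}\Bigr\},\qquad
\nabla F_{\lambda}(x)=\frac{x-p_{\lambda}(x)}{\lambda},
```

where p_λ(x) is the unique minimizer, i.e., the proximal point of x. F_λ is continuously differentiable with a Lipschitz gradient and shares its minimizers with f, which is what allows a smooth conjugate gradient scheme to tackle the nonsmooth problem.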