Funding: Sponsored by the National Natural Science Foundation of China (Grant No. 11901561).
Abstract: Many methods have been put forward to solve unconstrained optimization problems, among which the conjugate gradient (CG) method is very important. With the increasing emergence of large-scale problems, subspace techniques have become particularly important and are widely used in the field of optimization. In this study, a new CG method is proposed that combines a subspace technique with a cubic regularization model, and a special scaled norm in the cubic regularization model is analyzed. Under certain conditions, some significant properties of the search direction are given and the convergence of the algorithm is established. Numerical comparisons on 145 test functions from the CUTEr library show that the proposed method outperforms two classical CG methods and two recent subspace conjugate gradient methods.
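The abstract does not reproduce the algorithm itself, so the sketch below is only an illustration of the general idea of pairing a cubic regularization model with a low-dimensional subspace: the model m(d) = g^T d + (1/2) d^T B d + (sigma/3)||d||^3 is minimized over span{-g, d_prev} instead of over the whole space. The paper's special scaled norm, subspace choice, and parameter rules are not reflected here, and all names (subspace_cubic_direction, sigma, etc.) are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def subspace_cubic_direction(g, d_prev, B, sigma):
    """Minimize m(d) = g.T d + 0.5 d.T B d + (sigma/3)*||d||^3
    over the two-dimensional subspace spanned by -g and the previous direction."""
    S = np.column_stack([-g, d_prev])            # n x 2 subspace basis

    def reduced_model(t):                        # model restricted to d = S @ t
        d = S @ t
        return g @ d + 0.5 * d @ (B @ d) + (sigma / 3.0) * np.linalg.norm(d) ** 3

    t_star = minimize(reduced_model, np.array([1.0, 0.0]), method="BFGS").x
    return S @ t_star                            # lift the 2-D minimizer back to R^n

# Toy run on a strictly convex quadratic f(x) = 0.5 x.T A x - b.T x
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); A = A @ A.T + 5.0 * np.eye(5)
b = rng.standard_normal(5)
x, d = np.zeros(5), rng.standard_normal(5)
for _ in range(15):
    g = A @ x - b                                # gradient of f at x
    d = subspace_cubic_direction(g, d, A, sigma=1.0)
    x = x + d                                    # the cubic term controls the step length
print("final gradient norm:", np.linalg.norm(A @ x - b))
```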
Abstract: Crosswell seismic tomography can be used to study the lateral variation of reservoirs, reservoir properties, and the dynamic movement of fluids. In view of the instability of crosswell seismic tomography, the gradient method is improved by introducing regularization, and a gradient regularization method is presented in this paper. The method is verified by processing numerical simulation data and physical model data.
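A rough sketch of what a gradient regularization method for a linearized travel-time tomography problem can look like is given below; the forward operator G, the plain Tikhonov penalty, and the fixed step size are all assumptions for illustration, not the formulation used in the paper.

```python
import numpy as np

def gradient_regularized_inversion(G, t_obs, lam=1e-2, step=1e-4, iters=2000):
    """Gradient descent on J(m) = ||G m - t_obs||^2 + lam * ||m||^2,
    where G maps a slowness model m to predicted travel times."""
    m = np.zeros(G.shape[1])
    for _ in range(iters):
        grad = 2.0 * G.T @ (G @ m - t_obs) + 2.0 * lam * m   # gradient of J
        m -= step * grad
    return m
```

The penalty term lam * ||m||^2 damps the components of the update that the data constrain poorly, which is the stabilizing role the abstract attributes to regularization.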
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11471253 and 11571311).
Abstract: In this paper, we consider a Cauchy problem for the time-fractional diffusion equation (TFDE) on x ∈ [0, L]. This problem is ubiquitous in science and engineering applications. The ill-posedness of the Cauchy problem is explained through its solution in the frequency domain. The problem is then formulated as a minimization problem with a modified Tikhonov regularization method. The gradient of the regularization functional is derived via an adjoint problem, and the standard conjugate gradient method is presented for solving the minimization problem. Error estimates for the regularized solutions are obtained under H^p-norm a priori bound assumptions. Finally, numerical examples illustrate the effectiveness of the proposed method.
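The numerical ingredient the abstract names, minimizing a Tikhonov-regularized functional with the standard conjugate gradient method, can be sketched as follows. The matrix A below is only a random stand-in for the discretized TFDE forward operator (the real operator and its adjoint come from the paper), and the regularized normal equations (A^T A + alpha I) f = A^T g_delta are solved with plain CG:

```python
import numpy as np

def cg(apply_H, rhs, tol=1e-10, max_iter=500):
    """Standard conjugate gradient for H x = rhs, H symmetric positive definite;
    apply_H(x) returns the product H @ x."""
    x = np.zeros_like(rhs)
    r = rhs - apply_H(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = apply_H(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Tikhonov-regularized least squares: minimize ||A f - g_delta||^2 + alpha ||f||^2,
# which is equivalent to the normal equations (A.T A + alpha I) f = A.T g_delta.
rng = np.random.default_rng(1)
A = rng.standard_normal((80, 60)) / 80.0                 # stand-in forward operator
f_true = np.sin(np.linspace(0.0, np.pi, 60))
g_delta = A @ f_true + 1e-3 * rng.standard_normal(80)    # noisy Cauchy data
alpha = 1e-4
f_alpha = cg(lambda f: A.T @ (A @ f) + alpha * f, A.T @ g_delta)
```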
Funding: Supported by the National Basic Research Program (No. 2005CB321702), the National Outstanding Young Scientist Foundation (No. 10525102), the Specialized Research Grant for High Educational Doctoral Program (Nos. 20090211120011 and LZULL200909), Hong Kong RGC grants, and HKBU FRGs.
Abstract: Image restoration is often solved by minimizing an energy function consisting of a data-fidelity term and a regularization term. A convex regularization term can usually preserve the image edges well in the restored image. In this paper, we consider a class of convex and edge-preserving regularization functions, namely multiplicative half-quadratic regularizations, and we use the Newton method to solve the correspondingly reduced systems of nonlinear equations. At each Newton iterate, the preconditioned conjugate gradient method, incorporating a constraint preconditioner, is employed to solve the structured Newton equation, which has a symmetric positive definite coefficient matrix. Eigenvalue bounds of the preconditioned matrix are derived and can be used to estimate the convergence speed of the preconditioned conjugate gradient method. Experimental results demonstrate that the new approach is efficient and that the restored images are of reasonably good quality.
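The inner solver here is a preconditioned conjugate gradient iteration on a symmetric positive definite Newton system. The paper's constraint preconditioner is tied to the half-quadratic structure and is not reproduced here; the sketch below shows a generic PCG loop with a simple Jacobi (diagonal) preconditioner standing in for it.

```python
import numpy as np

def pcg(A, b, apply_Minv, tol=1e-8, max_iter=500):
    """Preconditioned conjugate gradient for A x = b with A symmetric positive
    definite; apply_Minv(r) applies the inverse of the preconditioner to r."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Jacobi preconditioner as a placeholder for the paper's constraint preconditioner
rng = np.random.default_rng(3)
A = rng.standard_normal((50, 50)); A = A @ A.T + 50.0 * np.eye(50)
b = rng.standard_normal(50)
x = pcg(A, b, lambda r: r / np.diag(A))
```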
Abstract: This article is devoted to a time series prediction scheme based on the nonlinear autoregressive algorithm and its applications. The scheme is implemented by means of an artificial neural network containing one hidden layer. As training algorithms we use the scaled conjugate gradient (SCG) method and the Bayesian regularization (BReg) method. The first method is applied to time series without noise, while the second can also be applied to noisy datasets. We apply the suggested scheme to the prediction of time series arising in oil and gas pricing, using 50 and 100 past values. Results of numerical simulations are presented and discussed.
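A minimal sketch of the nonlinear autoregressive setup (a window of past values fed into a network with one hidden layer) is shown below. It uses scikit-learn's MLPRegressor, whose L-BFGS solver and L2 penalty alpha merely stand in for the scaled conjugate gradient and Bayesian regularization training described in the article; the random-walk series is synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(series, n_lags):
    """Build (X, y) pairs where each row of X holds n_lags past values
    and y is the next value, i.e. a nonlinear autoregressive setup."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

series = np.cumsum(np.random.default_rng(2).standard_normal(600))  # toy "price" path
X, y = make_lagged(series, n_lags=50)          # 50 past values, as in the abstract
split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(10,),  # one hidden layer
                     alpha=1e-3,                # L2 penalty, loosely mirroring regularization
                     solver="lbfgs", max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print("test RMSE:", np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2)))
```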
Funding: Subsidized by the Special Funds for Major State Basic Research Projects (G1999032803).
Abstract: A class of regularized conjugate gradient methods is presented for solving large sparse systems of linear equations whose coefficient matrix is an ill-conditioned symmetric positive definite matrix. The convergence properties of these methods are discussed in depth, and the best possible choices of the parameters involved in the new methods are investigated in detail. Numerical computations show that the new methods are more efficient and robust than both classical relaxation methods and classical conjugate direction methods.
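The abstract does not spell out the particular regularized CG variants, so the following is only an illustrative stand-in under assumed names: each outer sweep applies plain CG to a shifted, better-conditioned system (A + nu*I) d = b - A x and then corrects the iterate, so the bias introduced by the shift is iterated away.

```python
import numpy as np

def shifted_cg(A, b, nu, tol=1e-12, max_iter=1000):
    """Plain CG applied to the better-conditioned shifted system (A + nu*I) x = b."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p + nu * p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def regularized_cg_solve(A, b, nu=1e-3, outer_sweeps=20):
    """Illustrative defect correction: solve shifted systems by CG on the residual
    b - A x and accumulate the corrections, which iterates the shift bias away."""
    x = np.zeros_like(b)
    for _ in range(outer_sweeps):
        x += shifted_cg(A, b - A @ x, nu)
    return x
```

For symmetric positive definite A the outer error is multiplied by at most nu/(lambda_min + nu) per sweep, so the shift trades inner conditioning against outer convergence speed: a larger nu makes each shifted system easier for CG, while a smaller nu lets the outer loop converge in fewer sweeps.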