Abstract: Conjugate gradient optimization algorithms depend on their search directions, which vary with different choices of the parameters in those directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method with two kinds of common line searches is proved. Firstly, it is shown that, using a reverse modulus of continuity function and a forcing function, the new method for solving unconstrained optimization works for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Secondly, by a comparison technique, some general convergence properties of the new method under another kind of step size rule are established. Numerical experiments show that the new method is efficient compared with the FR conjugate gradient method.
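The restarting idea described above can be illustrated with a generic sketch: a nonlinear conjugate gradient loop using a PR-style parameter that restarts with steepest descent when the direction is unsafe. This is not the paper's exact method; in particular, an Armijo backtracking line search (with illustrative constants) stands in for the Curry-Altman step size rule.

```python
import numpy as np

def cg_restart(f, grad, x0, max_iter=200, tol=1e-8):
    """Nonlinear conjugate gradient with a Polak-Ribiere (PR) style
    parameter and a simple restart rule. A generic sketch, not the
    paper's method; Armijo backtracking replaces Curry-Altman steps."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (hypothetical constants)
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PR parameter, clipped at zero (PR+), which acts as a restart
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        # restart with steepest descent if d is not a descent direction
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a small strongly convex quadratic the iterates reach the exact minimizer quickly, which is a convenient sanity check for any implementation of this family.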
Funding: Supported by the National Natural Science Foundation of China (10571106) and the Fundamental Research Funds for the Central Universities (10CX04044A).
Abstract: In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the FR method, a class of new restarting three-term conjugate gradient methods is presented. Global convergence properties of the new method with two kinds of common line searches are proved.
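A three-term direction of the general kind this note studies can be sketched as follows. The HS-style parameter and the coefficient of the extra term are illustrative choices, not the note's exact restarting formulas; this particular combination is a well-known three-term form that guarantees sufficient descent.

```python
import numpy as np

def three_term_direction(g_new, g_old, d_old):
    """Generic three-term conjugate gradient direction:
    d = -g_new + beta * d_old - theta * y,  with y = g_new - g_old.
    Illustrative coefficients only, not the note's exact formulas."""
    y = g_new - g_old
    denom = d_old @ y
    if abs(denom) < 1e-12:          # degenerate: restart with steepest descent
        return -g_new
    beta = (g_new @ y) / denom      # Hestenes-Stiefel style parameter
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y
```

With these coefficients the two extra terms cancel in the inner product with g_new, so d @ g_new = -||g_new||^2 holds regardless of the line search, which is the sufficient-descent property such methods exploit.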
Funding: Supported by the National Natural Science Foundation of China (No. 11371253).
Abstract: In this paper, we construct a new algorithm that combines the conjugate gradient and Lanczos methods for solving nonlinear systems. The iterative direction is obtained by solving a quadratic model via the conjugate gradient and Lanczos methods. Using a backtracking line search, we find an acceptable trial step size along this direction that makes the objective function nonmonotonically decreasing and the norm of the step monotonically increasing. Global convergence and a local superlinear convergence rate of the proposed algorithm are established under some reasonable conditions. Finally, we present some numerical results to illustrate the effectiveness of the proposed algorithm.
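The nonmonotone acceptance rule mentioned above can be sketched in isolation: instead of requiring decrease relative to the latest function value, the step is accepted when it improves on the maximum of the last few values. The constants sigma, rho, and the history length are illustrative, not the paper's.

```python
import numpy as np

def nonmonotone_backtracking(f, x, d, g, history, sigma=1e-4, rho=0.5):
    """Nonmonotone Armijo backtracking: accept a step size t when
    f(x + t*d) <= max(recent f values) + sigma * t * g'd.
    'history' holds recent objective values; constants are illustrative."""
    f_ref = max(history)            # reference value over recent iterates
    t = 1.0
    while f(x + t * d) > f_ref + sigma * t * (g @ d):
        t *= rho                    # shrink the trial step
    return t
```

In a full algorithm the caller would append f(x + t*d) to the history and keep only its last few entries, so occasional increases of f are tolerated while overall progress is enforced.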
Abstract: In this paper, the discrete-time static output feedback control design problem is considered. A nonlinear conjugate gradient method is analyzed and studied for solving the unconstrained matrix optimization problem that results from this optimal control problem. In addition, through a certain parametrization of the optimization problem, an initial stabilizing static output feedback gain matrix is not required to start the conjugate gradient method. Finally, the proposed algorithms are tested numerically on several test problems from the benchmark collection.
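Conjugate gradient iterations over a matrix unknown, as used above, differ from the vector case only in that inner products become Frobenius inner products. The following sketch runs Fletcher-Reeves CG on a toy objective 0.5*||A X - B||_F^2; the actual output-feedback cost in the paper is different, so this shows only the matrix-variable mechanics.

```python
import numpy as np

def matrix_cg(A, B, X0, iters=100, tol=1e-10):
    """Fletcher-Reeves CG on a matrix variable for the toy objective
    0.5 * ||A X - B||_F^2 (not the paper's output-feedback cost).
    Uses the exact step size available for a quadratic objective."""
    X = X0.copy()
    G = A.T @ (A @ X - B)               # gradient of the toy objective
    D = -G
    for _ in range(iters):
        gg = np.sum(G * G)              # Frobenius inner product <G, G>
        if gg < tol:
            break
        AD = A @ D
        t = gg / np.sum(AD * AD)        # exact minimizing step along D
        X = X + t * D
        G_new = A.T @ (A @ X - B)
        beta = np.sum(G_new * G_new) / gg   # Fletcher-Reeves parameter
        D = -G_new + beta * D
        G = G_new
    return X
```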
Abstract: This article is devoted to a time series prediction scheme involving the nonlinear autoregressive algorithm and its applications. The scheme is implemented by means of an artificial neural network containing one hidden layer. As training algorithms we use the scaled conjugate gradient (SCG) method and the Bayesian regularization (BReg) method. The first method is applied to time series without noise, while the second can also be applied to noisy datasets. We apply the suggested scheme to prediction of time series arising in oil and gas pricing, using 50 and 100 past values. Results of numerical simulations are presented and discussed.
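The data preparation underlying a nonlinear autoregressive scheme of this kind can be sketched independently of the network: each target value is predicted from its p preceding values (p = 50 or 100 in the article). The hidden-layer network and the SCG/Bayesian-regularized training are omitted here; this shows only the lag embedding.

```python
import numpy as np

def lag_embed(series, p):
    """Build (X, y) regression pairs for a nonlinear autoregressive
    model: predict series[t] from the p previous values. The neural
    network trained on top of these pairs is not shown."""
    X = np.array([series[t - p:t] for t in range(p, len(series))])
    y = np.array(series[p:])
    return X, y
```

Any regressor can then be fitted to (X, y); forecasting proceeds by feeding the model its own predictions as the most recent lags.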
Abstract: In this paper, a three-term derivative-free projection method is proposed for solving nonlinear monotone equations. Under some appropriate conditions, the global convergence and R-linear convergence rate of the proposed method are analyzed and proved. With no need of any derivative information, the proposed method is able to solve large-scale nonlinear monotone equations. Numerical comparisons show that the proposed method is effective.
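The projection step at the core of derivative-free methods for monotone equations F(x) = 0 can be sketched as follows: a trial point z = x + t*d defines a hyperplane that, by monotonicity, separates the current iterate from the solution set, and the iterate is projected onto it. This is the generic Solodov-Svaiter-style update, not the paper's exact rule.

```python
import numpy as np

def projection_step(F, x, d, t):
    """One hyperplane-projection update for monotone F(x) = 0.
    z = x + t*d is the trial point; x is projected onto the
    hyperplane {y : F(z) @ (y - z) = 0}. A generic sketch."""
    z = x + t * d
    Fz = F(z)
    return x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
```

No derivative of F appears anywhere in the update, which is why such methods scale to large problems where Jacobians are unavailable or too expensive.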
Funding: Supported by the National Natural Science Foundation of China (72071202), the Postgraduate Research & Practice Innovation Program of Jiangsu Province (KYCX22_2491), the Graduate Innovation Program of China University of Mining and Technology (2022WLKXJ021), and the Undergraduate Training Program for Innovation and Entrepreneurship, China University of Mining and Technology (202210290205Y).