The optimal control problem for nonlinear interconnected large-scale dynamic systems is considered. A successive approximation approach for designing the optimal controller is proposed with respect to quadratic performance indexes. Using this approach, the high-order, coupled, nonlinear two-point boundary value (TPBV) problem is transformed into a sequence of linear, decoupled TPBV problems. It is proven that the solution sequence of the TPBV problems converges uniformly to the optimal control for nonlinear interconnected large-scale systems. A suboptimal control law is obtained from a finite number of iterations of the optimal control sequence. Funding: supported by the National Natural Science Foundation of P. R. China (60074001) and the Natural Science Foundation of Shandong Province (Y2000G02).
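For orientation only (generic notation, not taken from the paper), the setting behind such a successive approximation scheme can be sketched as follows: minimize a quadratic performance index

\[
J = \frac{1}{2}\,x^{T}(t_f)\,Q_f\,x(t_f) + \frac{1}{2}\int_{t_0}^{t_f}\bigl(x^{T}Qx + u^{T}Ru\bigr)\,dt
\]

subject to dynamics of the form \(\dot{x} = Ax + Bu + f(x)\), where \(f\) collects the nonlinear and interconnection terms. Freezing \(f\) at the previous iterate, \(\dot{x}^{(k)} = Ax^{(k)} + Bu^{(k)} + f(x^{(k-1)})\), turns the necessary optimality conditions at each iteration into a linear, decoupled TPBV problem, and a suboptimal law is taken after finitely many iterations.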
This paper studies optimal control with zero steady-state error for nonlinear large-scale systems affected by external persistent disturbances. The nonlinear large-scale system is decomposed into N nonlinear subsystems with interconnection terms. Based on the internal model principle, a disturbance compensator is constructed so that the i-th subsystem with external persistent disturbances is transformed into an augmented subsystem without disturbances. Using the sensitivity approach, the optimal tracking control law for the i-th nonlinear subsystem is obtained, from which the optimal tracking control law for the overall nonlinear large-scale system follows. A numerical simulation shows that the method is effective. Funding: supported by the National Natural Science Foundation of China (No. 60574023) and the Natural Science Foundation of Shandong Province (No. Z2005G01).
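A hedged sketch of the internal-model construction (the symbols D, F_i, and z_i are illustrative assumptions): a persistent disturbance is modeled by a known exosystem

\[
\dot{w} = Dw, \qquad v_i = F_i w,
\]

and a compensator state \(z_i\) that replicates the exosystem dynamics is appended to the i-th subsystem state, so that the augmented state \(\bar{x}_i = (x_i^{T},\, z_i^{T})^{T}\) obeys a subsystem model in which the disturbance no longer appears explicitly; the tracking problem is then solved for this augmented, disturbance-free subsystem.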
In this paper we report a sparse truncated Newton algorithm for large-scale simple-bound nonlinear constrained minimization problems. The truncated Newton method updates the variables whose indices lie outside the active set, while the projected gradient method updates the active variables. At each iteration, the search direction consists of three parts: a subspace truncated Newton direction, a subspace gradient direction, and a modified gradient direction. The subspace truncated Newton direction is obtained by solving a sparse system of linear equations. The global convergence and quadratic convergence rate of the algorithm are proved, and some numerical tests are given. Funding: supported by the National Natural Science Foundation of China (F030101-60574021), the National "985" Project of China executed at Xi'an Jiaotong University, and the State Education Grant for Returned Scholars.
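The split between active and free variables can be illustrated with a minimal Python sketch (not the paper's code; the hess_vec interface, tolerances, and fallback rules are assumptions). The inner CG loop plays the role of the truncated Newton solve of the sparse linear system on the free subspace.

```python
import numpy as np

def active_set_tn_step(x, grad, hess_vec, lower, upper, tol=1e-8):
    """One illustrative iteration: projected gradient on the active variables,
    truncated Newton (CG) on the free subspace.  Sketch only."""
    # Variables sitting at a bound whose gradient pushes further outside are "active".
    active = ((x <= lower + tol) & (grad > 0)) | ((x >= upper - tol) & (grad < 0))
    free = ~active

    d = np.zeros_like(x)
    d[active] = -grad[active]          # projected-gradient direction on the active set

    # Truncated Newton direction on the free subspace: approximately solve H_ff p = -g_f.
    g_f = grad[free]
    p = np.zeros_like(g_f)
    r = -g_f.copy()                    # residual of H p + g = 0 at p = 0
    q = r.copy()                       # CG search direction
    for _ in range(g_f.size):
        v = np.zeros_like(x)
        v[free] = q
        Hq = hess_vec(v)[free]         # Hessian-vector product restricted to free variables
        curv = q @ Hq
        if curv <= 0:                  # negative curvature: stop (fall back to -g_f if no progress)
            if not p.any():
                p = -g_f
            break
        alpha = (r @ r) / curv
        p = p + alpha * q
        r_new = r - alpha * Hq
        if np.linalg.norm(r_new) <= 1e-6 * max(1.0, np.linalg.norm(g_f)):
            break
        q = r_new + ((r_new @ r_new) / (r @ r)) * q
        r = r_new
    d[free] = p

    # Take the step and project back onto the box (no line search in this sketch).
    return np.clip(x + d, lower, upper)
```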
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. A steplength technique that ensures the Zoutendijk condition is used, and the method is proved to be globally convergent. Finally, we improve the method and give further analysis.
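For reference (standard definitions, not the paper's specific update): a nonlinear CG method iterates

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_k = -g_k + \beta_k d_{k-1}, \qquad d_0 = -g_0 .
\]

Sufficient descent means \(g_k^{T} d_k \le -c\,\|g_k\|^{2}\) for some constant \(c > 0\) independent of \(k\); the Zoutendijk condition is \(\sum_{k\ge 0} (g_k^{T} d_k)^2 / \|d_k\|^{2} < \infty\), and together these imply \(\liminf_{k\to\infty} \|g_k\| = 0\), i.e. global convergence.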
A class of nonlinear problems with real parameters is defined. Generally, when the parameter values are very large, the problems in this class become ill-posed and numerical difficulties are encountered in solving them. In this paper, the nonlinear problems are reformulated to overcome the numerical difficulties associated with large parameter values. A novel iterative algorithm, which is suitable for large-scale problems and can be easily parallelized, is proposed to solve the reformulated problems. Numerical tests indicate that the proposed algorithm gives stable solutions. Convergence properties of the proposed algorithm are investigated. In the limiting case, when the corresponding constraint is exactly satisfied, the proposed method is equivalent to the standard augmented Lagrangian method.
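For comparison, the standard augmented Lagrangian that the method reduces to in the limiting case is, for equality constraints \(c(x) = 0\) (generic notation):

\[
L_{\rho}(x,\lambda) = f(x) + \lambda^{T} c(x) + \frac{\rho}{2}\,\|c(x)\|^{2},
\]

minimized over \(x\) for fixed \(\lambda\), followed by the multiplier update \(\lambda \leftarrow \lambda + \rho\, c(x)\).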
This paper discusses the two-block large-scale nonconvex optimization problem with general linear constraints. Based on the ideas of splitting and sequential quadratic optimization (SQO), a new feasible descent method for this problem is proposed. First, we consider the quadratic optimal (QO) approximation associated with the current feasible iterate and split it into two small-scale QO subproblems that can be solved in parallel. Second, a feasible descent direction is obtained and a new SQO-type method, the splitting feasible SQO (SF-SQO) method, is proposed. Moreover, under suitable conditions, we analyse the global convergence, strong convergence, and superlinear convergence rate of the SF-SQO method. Finally, preliminary numerical experiments on the economic dispatch of a power system show that the SF-SQO method is promising. Funding: supported by the National Natural Science Foundation of China (12171106), the Natural Science Foundation of Guangxi Province (2020GXNSFDA238017 and 2018GXNSFFA281007), and the Shanghai Sailing Program (21YF1430300).
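A schematic of the splitting idea, under the assumption that the problem has the two-block form \(\min_{x,y} f(x,y)\) subject to \(Ax + By \le b\) (this notation is illustrative, not the paper's): at a feasible iterate \((x_k, y_k)\) the QO approximation decomposes into a subproblem in the x-block,

\[
\min_{d_x}\ \nabla_x f(x_k,y_k)^{T} d_x + \tfrac{1}{2}\, d_x^{T} H_x d_x \quad \text{s.t.}\quad A(x_k + d_x) + B y_k \le b,
\]

and the analogous subproblem in the y-block; the two small-scale QOs are solved in parallel and their solutions combined into a feasible descent direction.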
A new two-level subspace method is proposed for solving general unconstrained minimization formulations discretized from infinite-dimensional optimization problems. At each iteration, the algorithm executes either a direct step on the current level or a coarse subspace correction step. In the coarse subspace correction step, the traditional coarse-grid space is augmented by a two-dimensional subspace spanned by the coordinate direction and the gradient direction at the current point. Global convergence is proved and the convergence rate is studied under mild conditions on the discretized functions. Preliminary numerical experiments on a few variational problems show that the two-level subspace method is promising.
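One plausible way to write the coarse subspace correction step (P and \(e_{i_k}\) are assumed notation for the coarse-grid prolongation and a chosen coordinate direction):

\[
\min_{z,\,\mu,\,\nu}\ f\bigl(x_k + P z + \mu\, e_{i_k} + \nu\, \nabla f(x_k)\bigr),
\]

i.e. the traditional coarse-grid space \(\mathrm{range}(P)\) is enlarged by the two-dimensional subspace spanned by the coordinate direction and the current gradient, and the correction is the minimizer of this low-dimensional problem.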
With the advent of the big data age, the scales of real-world optimization problems with many decision variables are becoming much larger. How to develop new optimization algorithms for these large-scale problems, and how to improve the scalability of existing algorithms, pose ongoing challenges in bio-inspired computation; addressing such complex large-scale problems so as to produce truly useful results is currently one of the hottest topics. As a branch of swarm-intelligence-based algorithms, particle swarm optimization (PSO) for large-scale problems and its diverse applications have developed rapidly over the last decade. This review paper presents recent achievements and trends, and highlights the unsolved challenging problems and key issues with a large impact, in order to encourage further research in both large-scale PSO theory and its applications in the coming years.
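For readers new to PSO, a textbook global-best update (a plain sketch, not any of the large-scale variants surveyed; the parameter values are conventional defaults) looks like this:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
                 lo=-5.0, hi=5.0, seed=0):
    """Plain global-best PSO on the box [lo, hi]^dim -- a textbook sketch."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()                                   # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()             # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Example: minimize the sphere function in 50 dimensions.
best_x, best_val = pso_minimize(lambda z: float(np.sum(z**2)), dim=50)
```

Large-scale variants typically modify this basic loop through cooperative coevolution (optimizing groups of variables separately), adaptive parameters, or distributed swarms.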
Optimal experimental methods for multidimensional dynamic programming, large-scale linear systems, and complex nonlinear systems are presented, representing a significant step forward in the field of systems science. Excellent results are obtained in applications to complex water resource systems.
An efficient active-set approach is presented for both nonnegative and general linear programming, in which varying numbers of constraints are added at each iteration. Computational experiments demonstrate that the proposed approach is significantly faster than previous active-set and standard linear programming algorithms.
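A rough illustration of solving an LP by adding violated constraints in batches (scipy.optimize.linprog is used here as a stand-in solver; the batch size, tolerances, and unboundedness handling are arbitrary choices, not the paper's):

```python
import numpy as np
from scipy.optimize import linprog

def lp_by_constraint_generation(c, A, b, batch=50, tol=1e-9, max_rounds=100):
    """Solve min c^T x s.t. A x <= b, x >= 0 by starting with no inequality
    rows and repeatedly adding the most violated ones -- an illustrative
    constraint-generation loop, not the paper's algorithm."""
    m, n = A.shape
    working = np.zeros(m, dtype=bool)             # rows currently in the model
    for _ in range(max_rounds):
        rows = np.flatnonzero(working)
        res = linprog(c,
                      A_ub=A[rows] if rows.size else None,
                      b_ub=b[rows] if rows.size else None,
                      bounds=[(0, None)] * n,
                      method="highs")
        if res.status == 3:                       # relaxed LP unbounded: bring in more rows
            working[np.flatnonzero(~working)[:batch]] = True
            continue
        if not res.success:
            raise RuntimeError(res.message)
        x = res.x
        violation = A @ x - b                     # positive entries are violated rows
        violated = np.flatnonzero((violation > tol) & ~working)
        if violated.size == 0:
            return x                              # feasible for the full problem, hence optimal
        worst = violated[np.argsort(violation[violated])[::-1][:batch]]
        working[worst] = True
    raise RuntimeError("constraint generation did not converge")
```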
In this paper we test different conjugate gradient (CG) methods for solving large-scale unconstrained optimization problems. The methods are divided into two groups: the first group includes five basic CG methods and the second includes five hybrid CG methods. A collection of medium-scale and large-scale test problems is drawn from a standard code of test problems, CUTE. The conjugate gradient methods are ranked according to the numerical results, and some remarks are given. Funding: research partially supported by Chinese NSF grants 19801033, 19771047 and 10171104.
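For context, five classical formulas for the CG parameter (whether these are exactly the five basic methods tested here is not stated in the abstract) are, with \(y_{k-1} = g_k - g_{k-1}\):

\[
\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2},\quad
\beta_k^{PRP} = \frac{g_k^{T} y_{k-1}}{\|g_{k-1}\|^2},\quad
\beta_k^{HS} = \frac{g_k^{T} y_{k-1}}{d_{k-1}^{T} y_{k-1}},\quad
\beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^{T} y_{k-1}},\quad
\beta_k^{CD} = -\frac{\|g_k\|^2}{d_{k-1}^{T} g_{k-1}},
\]

all used in the update \(d_k = -g_k + \beta_k d_{k-1}\); hybrid methods typically switch between or truncate such formulas.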