Funding: This work is supported by the National Natural Science Foundation of China and the National Key Project of China. This paper is based on a presentation at the International Symposium on "Intervention and Adaptation in Complex Systems" held in Beijing.
Abstract: The adaptive systems theory presented in this paper consists of two closely related parts: adaptive estimation (or filtering, prediction) and adaptive control of dynamical systems. Both adaptive estimation and control are nonlinear mappings of the on-line observed signals of dynamical systems, whose main features are uncertainties in both the system structure and the external disturbances, together with the non-stationarity and dependence of the system signals. Thus, a key difficulty in establishing a mathematical theory of adaptive systems lies in dealing with the complicated nonlinear stochastic dynamical systems that describe the adaptation processes. In this paper, we will illustrate some of the basic concepts, methods and results through simple examples. The following fundamental questions will be discussed: How much information is needed for estimation? How can uncertainty be dealt with by adaptation? How can an adaptive system be analyzed? What are the convergence or tracking performances of adaptation? How can a proper rate of adaptation be chosen? We will also explore the following more fundamental questions: How much uncertainty can be dealt with by adaptation? What are the limitations of adaptation? How does the performance of adaptation depend on the prior information? We will partially answer these questions by finding certain 'critical values' and establishing 'Impossibility Theorems' on the capability of adaptation, for several basic classes of nonlinear dynamical control systems with either parametric or nonparametric uncertainties.
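To make the pairing of adaptive estimation and adaptive control concrete, the following is a minimal illustrative sketch, not taken from the paper: it simulates a scalar stochastic system y_{t+1} = theta*y_t + u_t + w_t with an unknown parameter theta, estimates theta on-line by recursive least squares, and applies a certainty-equivalence control law so that the output tracks a reference signal. All variable names, the reference trajectory, and the noise level are assumptions chosen only for illustration.

```python
import numpy as np

# Illustrative sketch (not from the paper): certainty-equivalence adaptive
# tracking for the scalar stochastic system  y_{t+1} = theta*y_t + u_t + w_t,
# where theta is unknown to the controller and w_t is additive noise.
# The estimator is a standard scalar recursive least-squares (RLS) update.

rng = np.random.default_rng(0)

theta = 1.5          # true parameter, used only to simulate the plant
theta_hat = 0.0      # initial parameter estimate
p = 100.0            # initial RLS "covariance" (adaptation gain)
y = 0.0              # initial output
T = 200
ref = np.sin(0.05 * np.arange(T + 1))   # reference trajectory to track

errors = []
for t in range(T):
    # Certainty-equivalence control: use theta_hat as if it were theta,
    # cancelling the estimated dynamics so y_{t+1} aims at ref[t+1].
    u = ref[t + 1] - theta_hat * y

    # Plant update with additive noise (unknown to the controller).
    w = 0.1 * rng.standard_normal()
    y_next = theta * y + u + w

    # Scalar RLS update of theta_hat from the regressor y and the
    # prediction error  y_next - (theta_hat*y + u).
    denom = 1.0 + p * y * y
    gain = p * y / denom
    theta_hat += gain * (y_next - theta_hat * y - u)
    p -= (p * y) ** 2 / denom          # p becomes p / (1 + p*y*y)

    errors.append(y_next - ref[t + 1])
    y = y_next

print("final parameter estimate:", theta_hat)
print("mean squared tracking error over last 50 steps:",
      np.mean(np.square(errors[-50:])))
```

The loop illustrates the two intertwined parts discussed in the abstract: the estimation step (RLS) and the control step (certainty equivalence) feed back into each other through the observed signals, which is precisely what makes the closed-loop adaptation process a nonlinear stochastic dynamical system.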