Abstract
The computation of the viability kernel is an important research topic in control theory. In this paper, we propose a new algorithm for computing the viability kernel of a general discrete-time control system. Based on machine learning, an algorithm for approximating the viability kernel is presented, and its convergence to the actual viability kernel is proved under certain conditions. To a certain extent, the algorithm avoids the exponential growth of the computational cost with the dimension of the control space. Finally, concrete practical examples are given to illustrate the effectiveness of the algorithm.
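For context, the viability kernel of a constraint set K is the set of initial states from which at least one control sequence keeps the trajectory x_{n+1} = f(x_n, u_n) inside K for all time. The paper's own machine-learning algorithm is not reproduced here; the following is only a minimal sketch, under assumed choices, of the classical set-iteration scheme K_{n+1} = { x in K_n : f(x, u) in K_n for some admissible u }, with a learned classifier (scikit-learn's SVC, an assumed dependency) standing in for the indicator function of the current approximation. The dynamics, constraint set, grid resolution, and all names (DT, CONTROLS, step, in_K, make_predictor) are hypothetical illustrations, not taken from the paper.

```python
# Minimal sketch: grid-based approximation of a viability kernel for a
# hypothetical 2-D discrete-time system, with an SVM surrogate for the
# current approximation of the kernel. Not the paper's algorithm.
import numpy as np
from sklearn.svm import SVC

DT = 0.1
X_MIN, X_MAX, Y_MIN, Y_MAX = 0.2, 3.0, -2.0, 2.0   # constraint set K
CONTROLS = np.linspace(-0.5, 0.5, 11)                # sampled control set

def step(x, y, u):
    """One step of the assumed discrete-time dynamics."""
    return x + x * y * DT, y + u * DT

def in_K(x, y):
    """Membership in the constraint set K."""
    return X_MIN <= x <= X_MAX and Y_MIN <= y <= Y_MAX

def make_predictor(points, labels):
    """Learned surrogate of the current kernel approximation."""
    if labels.all() or not labels.any():
        # Only one class present (e.g. first iteration): constant predictor.
        const = bool(labels.all())
        return lambda p: const
    clf = SVC(kernel="rbf", gamma=2.0, C=100.0)
    clf.fit(points, labels)
    return lambda p: bool(clf.predict([p])[0])

# Sample K on a grid; initially every grid point is assumed viable.
xs, ys = np.meshgrid(np.linspace(X_MIN, X_MAX, 40),
                     np.linspace(Y_MIN, Y_MAX, 40))
grid = np.column_stack([xs.ravel(), ys.ravel()])
viable = np.ones(len(grid), dtype=bool)

for _ in range(30):                                  # outer refinement loop
    predict = make_predictor(grid, viable)
    new_viable = viable.copy()
    for i, (x, y) in enumerate(grid):
        if not viable[i]:
            continue
        # Keep the point if some control sends it into K and into the
        # current (learned) approximation of the kernel.
        ok = False
        for u in CONTROLS:
            nx, ny = step(x, y, u)
            if in_K(nx, ny) and predict([nx, ny]):
                ok = True
                break
        new_viable[i] = ok
    if np.array_equal(new_viable, viable):           # fixed point reached
        break
    viable = new_viable

print(f"approximate kernel: {viable.sum()} of {len(grid)} grid points kept")
```

The per-point calls to the classifier are deliberately naive and slow; the sketch is meant only to show where a learned decision function can replace an explicit grid lookup over the current set, which is the general idea the abstract refers to.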
Source
Operations Research Transactions (《运筹学学报》), indexed in CSCD and the Peking University Core Journals list, 2013, No. 4, pp. 24-32 (9 pages)
Funding
National Natural Science Foundation of China (Nos. 11171221, 40901241); Capacity Building Project for Local Universities of the Science and Technology Commission of Shanghai Municipality (10550500800); Shanghai First-Class Discipline Project (XTKX2012)
Keywords
discrete-time systems, viability kernel, machine learning