Exponential Continuous Non-Parametric Neural Identifier With Predefined Convergence Velocity

Abstract: This paper addresses the design of an exponential function-based learning law for artificial neural networks (ANNs) with continuous dynamics. The ANN structure is used to obtain a non-parametric model of uncertain systems described by a set of nonlinear ordinary differential equations. Two novel adaptive algorithms with a predefined exponential convergence rate adjust the weights of the ANN. The first algorithm includes an adaptive gain depending on the identification error, which accelerates the convergence of the weights and promotes faster convergence between the states of the uncertain system and the trajectories of the neural identifier. The second approach uses a time-dependent sigmoidal gain that forces the identification error to converge to an invariant set characterized by an ellipsoid, whose generalized volume depends on the upper bounds of the uncertainties, perturbations, and modeling errors. Applying the invariant ellipsoid method yields an algorithm that reduces the volume of the convergence region for the identification error. Both adaptive algorithms are derived from a non-standard exponential-dependent function and an associated controlled Lyapunov function. Numerical examples demonstrate the improvements provided by the proposed algorithms by comparing their convergence against classical schemes with non-exponential continuous learning methods. The proposed identifiers outperform the classical identifier, achieving faster convergence to an invariant set of smaller dimensions.
Source: IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2022, No. 6, pp. 1049-1060 (12 pages). 自动化学报(英文版)
Funding: Supported by the National Polytechnic Institute (SIP-20221151, SIP-20220916).
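The abstract describes a continuous-time neural identifier whose weight adaptation uses an identification-error-dependent gain to speed up convergence. The paper's exact learning laws are not reproduced on this page, so the following is only a minimal, hypothetical sketch of the general idea for a scalar system: the plant, activation function, gains, and the specific gain schedule `k = k0*(1 + alpha*|e|)` are all illustrative assumptions, not the authors' algorithms.

```python
import numpy as np

# Hypothetical uncertain plant (unknown to the identifier): x_dot = -x + 0.5*sin(3x)
def plant(x):
    return -x + 0.5 * np.sin(3.0 * x)

def sigma(x):
    return np.tanh(x)  # activation basis used by the neural identifier

dt, T = 1e-3, 10.0
steps = int(T / dt)
x, xh, w = 1.0, 0.0, 0.0   # plant state, identifier state, adaptive weight
a = 2.0                    # stabilizing feedback gain of the identifier
k0, alpha = 5.0, 1.0       # base learning gain and error-dependent boost (assumed)

for _ in range(steps):
    e = xh - x                        # identification error
    k = k0 * (1.0 + alpha * abs(e))  # error-dependent adaptive gain: larger error, faster learning
    xh_dot = -a * e + w * sigma(xh)  # identifier dynamics
    w_dot = -k * e * sigma(xh)       # gradient-like continuous learning law
    x += dt * plant(x)               # forward-Euler integration of plant and identifier
    xh += dt * xh_dot
    w += dt * w_dot

print(abs(xh - x))  # final identification error
```

In this toy setup the error-dependent gain plays the role the abstract attributes to the first algorithm: it inflates the learning rate during large transients and relaxes it near the invariant set, so the identifier state approaches the plant trajectory faster than a fixed-gain law would.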