Funding: Supported by the Doctoral Startup Foundation of Taiyuan University of Science and Technology, China (Grant No. 20112010).
Abstract: The problem of delay-dependent asymptotic stability for neural networks with interval time-varying delay is investigated. Based on the delay decomposition method, a new type of Lyapunov-Krasovskii functional is constructed. Several novel delay-dependent stability criteria are presented in terms of linear matrix inequalities (LMIs) by using the Jensen integral inequality and a new convex combination technique. Numerical examples are given to demonstrate that the proposed method is effective and less conservative than existing results.
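Stability criteria of this kind reduce to checking the feasibility of an LMI. As a minimal sketch, assuming a classical delay-independent Lyapunov-Krasovskii condition for a linear delay system x'(t) = A x(t) + A_d x(t - d(t)) rather than the paper's delay-decomposition criterion, and with purely illustrative system matrices, one can verify a candidate matrix pair (P, Q) numerically:

```python
import numpy as np

# Hypothetical system matrices for x'(t) = A x(t) + Ad x(t - d(t));
# these are illustrative, not taken from the paper's examples.
A  = np.array([[-3.0, 0.0], [0.0, -3.0]])
Ad = np.array([[ 1.0, 0.0], [0.0,  1.0]])

# Candidate Lyapunov-Krasovskii matrices P > 0, Q > 0 (a classical
# delay-independent test, not the delay-decomposition criterion).
P = np.eye(2)
Q = 2.0 * np.eye(2)

# The LMI: the block matrix [[A'P + PA + Q, P Ad], [Ad' P, -Q]]
# must be negative definite for asymptotic stability.
M = np.block([[A.T @ P + P @ A + Q, P @ Ad],
              [Ad.T @ P,            -Q    ]])

# Negative definiteness checked via the eigenvalues of the symmetric block.
feasible = np.all(np.linalg.eigvalsh(M) < 0)
print("LMI feasible:", feasible)
```

In practice P and Q would be decision variables found by a semidefinite-programming solver; the fixed candidates above merely illustrate the block-matrix structure of such an LMI test.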
Funding: Supported by the National Natural Science Foundation of China (Grant No. 10901044), the Qianjiang Rencai Program of Zhejiang Province (Grant No. 2010R10101), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, and the Program for Excellent Young Teachers in Hangzhou Normal University.
Abstract: In this paper, we introduce a type of approximation operator of neural networks with sigmoidal functions on compact intervals, and obtain pointwise and uniform estimates of the approximation. To improve the approximation rate, we further introduce a type of combination of neural networks. Moreover, we show that the derivatives of functions can also be simultaneously approximated by the derivatives of the combinations. We also apply our method to construct approximation operators of neural networks with sigmoidal functions on infinite intervals.
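Operators of this kind typically place steep sigmoidal "bumps" at equally spaced nodes and weight them by samples of the target function. The following sketch is an illustrative construction under that general scheme, not the specific operator defined in the paper; the window width and steepness parameter w are assumptions chosen for the demonstration:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def nn_operator(f, n, x, w=200.0):
    """Approximate f on [0, 1] by a one-hidden-layer network whose basis
    'bumps' are differences of shifted sigmoids centered at the nodes k/n.
    (An illustrative operator, not the one constructed in the paper.)"""
    nodes = np.arange(n + 1) / n
    # Difference of two shifted sigmoids approximates the indicator of
    # the cell [k/n - 1/(2n), k/n + 1/(2n)]; the bumps sum to ~1 inside (0, 1).
    bumps = (sigmoid(w * (x[:, None] - nodes[None, :] + 1.0 / (2 * n)))
             - sigmoid(w * (x[:, None] - nodes[None, :] - 1.0 / (2 * n))))
    return bumps @ f(nodes)

f = np.square                      # target function f(x) = x^2
x = np.linspace(0.05, 0.95, 50)    # interior points, away from the endpoints
err = np.max(np.abs(nn_operator(f, 100, x) - f(x)))
print("max interior error:", err)
```

Because consecutive sigmoid differences telescope, the bumps form an approximate partition of unity on the interior of [0, 1], which is what makes the pointwise error small away from the endpoints; sharper uniform and derivative estimates are the subject of the paper itself.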