Abstract: There have been many studies on density theorems for approximation by radial basis feedforward neural networks, and some approximation problems for Gaussian radial basis feedforward neural networks (GRBFNs) in special function spaces have also been investigated. This paper considers approximation by GRBFNs in the space of continuous functions. It is proved that the rate of approximation by GRBFNs with n^d neurons to any continuous function f defined on a compact subset K ⊂ R^d can be controlled by ω(f, n^(-1/2)), where ω(f, t) is the modulus of continuity of the function f.
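Stated more formally, the claimed rate can be written as follows. This is a LaTeX sketch reconstructed from the abstract alone: the uniform norm on C(K), the generic constant C, and the explicit Gaussian network form are standard assumptions filled in for illustration, not taken from the paper itself.

\[
  \omega(f,t) = \sup_{\substack{x,y\in K \\ \|x-y\|\le t}} |f(x)-f(y)|
  \qquad \text{(modulus of continuity of } f\text{)}
\]
\[
  \inf_{g\in\mathcal{G}_{n^d}} \|f-g\|_{C(K)} \le C\,\omega\!\left(f,\, n^{-1/2}\right),
\]
where \(\mathcal{G}_{n^d}\) is assumed to denote the class of GRBFNs with \(n^d\) neurons, i.e. functions of the form
\[
  g(x) = \sum_{i=1}^{n^d} c_i \exp\!\left(-\frac{\|x-t_i\|^2}{\sigma_i^2}\right),
  \qquad c_i\in\mathbb{R},\ t_i\in\mathbb{R}^d,\ \sigma_i>0.
\]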
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61101240 and 61272023) and the Zhejiang Provincial Natural Science Foundation of China (Grant No. Y6110117).