Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61101240 and 61272023) and the Zhejiang Provincial Natural Science Foundation of China (Grant No. Y6110117).
Abstract: There have been many studies on the density theorem for approximation by radial basis feedforward neural networks, and some approximation problems for Gaussian radial basis feedforward neural networks (GRBFNs) in special function spaces have also been investigated. This paper considers approximation by GRBFNs in the space of continuous functions. It is proved that the rate of approximation by GRBFNs with n^d neurons to any continuous function f defined on a compact subset K ⊂ R^d can be controlled by ω(f, n^(-1/2)), where ω(f, t) is the modulus of continuity of the function f.
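In standard notation, and only as an illustrative restatement of the abstract (the constant C and the precise parametrization of the network class are not given there and are assumed here), the bound reads:

\[
\omega(f,t) \;=\; \sup_{\substack{x,y \in K \\ \|x-y\| \le t}} |f(x)-f(y)|,
\qquad
\operatorname{dist}_{C(K)}\!\bigl(f,\; \mathcal{G}_{n^d}\bigr) \;\le\; C\,\omega\!\bigl(f,\, n^{-1/2}\bigr),
\]

where \(\mathcal{G}_{n^d}\) is hypothetical notation for the class of GRBFNs with n^d hidden neurons, and C is assumed to be a constant independent of f and n.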
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61272023 and 91330118).
Abstract: In 1991, Hornik proved that the collection of single hidden layer feedforward neural networks (SLFNs) with a continuous, bounded, and non-constant activation function σ is dense in C(K), where K is a compact set in R^s (see Neural Networks, 4(2), 251-257 (1991)). Meanwhile, he pointed out that "Whether or not the continuity assumption can entirely be dropped is still an open quite challenging problem". This paper answers the problem in the affirmative and proves that, for a bounded and almost everywhere (a.e.) continuous activation function σ on R, the collection of SLFNs is dense in C(K) if and only if σ is non-constant a.e.
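Stated symbolically, and only as an illustrative reading of the abstract (the span notation below is introduced here, not taken from the paper): for a bounded, a.e. continuous σ : R → R and every compact K ⊂ R^s,

\[
\overline{\operatorname{span}}\,\{\,\sigma(w \cdot x + b) \;:\; w \in \mathbb{R}^s,\ b \in \mathbb{R}\,\} \;=\; C(K)
\quad\Longleftrightarrow\quad
\sigma \text{ is not a.e. equal to a constant,}
\]

with the closure taken in the uniform norm on K.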