Journal Articles
2 articles found
1. The Rate of Approximation of Gaussian Radial Basis Neural Networks in Continuous Function Space (Cited by: 1)
Authors: Ting Fan XIE, Fei Long CAO. Acta Mathematica Sinica, English Series (SCIE, CSCD), 2013, No. 2, pp. 295-302 (8 pages).
There have been many studies on the density theorem for approximation by radial basis feedforward neural networks, and some approximation problems by Gaussian radial basis feedforward neural networks (GRBFNs) in special function spaces have also been investigated. This paper considers approximation by GRBFNs in the space of continuous functions. It is proved that the rate of approximation by GRBFNs with n^d neurons to any continuous function f defined on a compact subset K ⊂ R^d can be controlled by ω(f, n^{-1/2}), where ω(f, t) is the modulus of continuity of f.
Keywords: Gaussian radial basis feedforward neural networks; approximation; rate of convergence; modulus of continuity
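A hedged formal restatement of the rate result may help; this is a sketch assuming standard notation, since the abstract does not give the network form N_n or the constant C explicitly:

\[
\inf_{N_n} \| f - N_n \|_{C(K)} \;\le\; C\,\omega\!\left(f, n^{-1/2}\right),
\qquad f \in C(K),\ K \subset \mathbb{R}^d \text{ compact},
\]

where the infimum is taken over GRBFNs $N_n$ with $n^d$ neurons and $\omega(f,t) = \sup_{x,y \in K,\ |x-y| \le t} |f(x) - f(y)|$ is the modulus of continuity of $f$.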
2. On a Problem of Hornik
Authors: Ting Fan XIE, Fei Long CAO. Acta Mathematica Sinica, English Series (SCIE, CSCD), 2015, No. 7, pp. 1141-1148 (8 pages).
In 1991, Hornik proved that the collection of single hidden layer feedforward neural networks (SLFNs) with continuous, bounded, and non-constant activation function σ is dense in C(K), where K is a compact set in R^s (see Neural Networks, 4(2), 251-257 (1991)). He also pointed out: "Whether or not the continuity assumption can entirely be dropped is still an open quite challenging problem." This paper answers the problem in the affirmative and proves that, for a bounded activation function σ on R that is continuous almost everywhere (a.e.), the collection of SLFNs is dense in C(K) if and only if σ is non-constant a.e.
Keywords: neural networks; approximation; activation function
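A hedged formal reading of the density claim follows; this is a sketch assuming the standard SLFN ridge form σ(w·x + b) with inner weights w and biases b, which the abstract does not spell out:

\[
\overline{\operatorname{span}}\,\{\, \sigma(w \cdot x + b) : w \in \mathbb{R}^{s},\ b \in \mathbb{R} \,\} \;=\; C(K)
\iff \sigma \text{ is non-constant a.e.},
\]

for any bounded $\sigma$ that is continuous a.e. on $\mathbb{R}$, with the closure taken in the uniform norm on the compact set $K \subset \mathbb{R}^{s}$.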