The Rate of Approximation of Gaussian Radial Basis Neural Networks in Continuous Function Space (Cited by: 1)

Abstract: There have been many studies on the density theorem for approximation by radial basis feedforward neural networks, and some approximation problems for Gaussian radial basis feedforward neural networks (GRBFNs) in special function spaces have also been investigated. This paper considers approximation by GRBFNs in the space of continuous functions. It is proved that the rate of approximation by GRBFNs with n^d neurons to any continuous function f defined on a compact subset K ⊂ R^d can be controlled by ω(f, n^(-1/2)), where ω(f, t) is the modulus of continuity of the function f.
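To make the stated rate easier to parse, the bound can be written out as a formula. This is only a sketch based on the abstract: the network parametrization (centers ξ_k and a shared width σ_n) and the generic constant C are assumed standard GRBFN notation and are not taken from the paper itself.

% Sketch of the approximation rate described in the abstract. The centers \xi_k,
% shared width \sigma_n, and constant C below are notational assumptions.
\[
  \inf_{c_k,\,\xi_k}
    \Bigl\| \, f - \sum_{k=1}^{n^d} c_k
      \exp\!\Bigl( -\frac{\| x - \xi_k \|^2}{\sigma_n^2} \Bigr)
    \Bigr\|_{C(K)}
  \;\le\; C\,\omega\bigl(f, n^{-1/2}\bigr),
  \qquad
  \omega(f, t) = \sup\bigl\{\, |f(x) - f(y)| : x, y \in K,\ \|x - y\| \le t \,\bigr\}.
\]

Here C(K) denotes the continuous functions on the compact set K under the uniform (sup) norm, which is the "continuous function space" referred to in the title.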
Source: Acta Mathematica Sinica, English Series (数学学报(英文版)), SCIE, CSCD, 2013, No. 2, pp. 295-302 (8 pages)
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61101240 and 61272023) and the Zhejiang Provincial Natural Science Foundation of China (Grant No. Y6110117)
Keywords: Gaussian radial basis feedforward neural networks, approximation, rate of convergence, modulus of continuity