
高斯核正则化学习算法的泛化误差

Generalization Error of Regularized Learning Algorithms with Gaussian Kernels
Abstract: This article studies the generalization performance of a regularized learning algorithm with a general convex loss function and varying Gaussian kernels. The goal is to derive a satisfactory upper bound on the generalization error of the algorithm. The generalization error is measured in terms of the regularization error and the sample error. Exploiting the special structure of Gaussian kernels, the regularization error is bounded from above by constructing a radial basis function (RBF) neural network; the sample error is bounded by means of a projection operator and the covering number of the reproducing kernel Hilbert space induced by Gaussian kernels. The results show that, with a suitable choice of the parameters σ and λ, the generalization performance of the learning algorithm can be improved.
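For orientation, the following is a minimal sketch, in standard learning-theory notation, of the kind of regularization scheme and error decomposition the abstract refers to. The symbols below (the empirical minimizer f_{z,λ}, the regularizing function f_λ, the target f_φ, the expected and empirical risks E and E_z, and the sample size m) are assumptions of this sketch rather than the paper's own notation, and the paper's sample-error analysis additionally applies a projection operator to f_{z,λ}, which is omitted here.

% Gaussian kernel with width sigma (assumed form):
%   K_sigma(x, t) = exp(-|x - t|^2 / sigma^2)
% Tikhonov-type regularization scheme with a convex loss phi and regularization parameter lambda:
\[
  f_{\mathbf{z},\lambda}
    = \mathop{\arg\min}_{f \in \mathcal{H}_{K_\sigma}}
      \Big\{ \frac{1}{m} \sum_{i=1}^{m} \phi\big(y_i, f(x_i)\big) + \lambda \|f\|_{K_\sigma}^{2} \Big\}.
\]
% Standard splitting of the excess generalization error into sample error and regularization error,
% where f_lambda minimizes E(f) + lambda*||f||^2 over H_{K_sigma} and f_phi minimizes the expected risk E:
\[
  \mathcal{E}(f_{\mathbf{z},\lambda}) - \mathcal{E}(f_\phi)
    \le \underbrace{\big[\mathcal{E}(f_{\mathbf{z},\lambda}) - \mathcal{E}_{\mathbf{z}}(f_{\mathbf{z},\lambda})\big]
        + \big[\mathcal{E}_{\mathbf{z}}(f_\lambda) - \mathcal{E}(f_\lambda)\big]}_{\text{sample error}}
      + \underbrace{\big[\mathcal{E}(f_\lambda) - \mathcal{E}(f_\phi) + \lambda \|f_\lambda\|_{K_\sigma}^{2}\big]}_{\text{regularization error}}.
\]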
Published in: Acta Mathematica Scientia, Series A (《数学物理学报(A辑)》), CSCD, Peking University Core Journal, 2014, No. 5, pp. 1049-1060 (12 pages).
Funding: National Natural Science Foundation of China (11301494) and Zhejiang Provincial Natural Science Foundation (Q12A01026).
Keywords: learning theory; RBF neural network; Gaussian kernels; generalization error.
