
Simulation of the Advanced Algorithm of Neural Networks Design Based on Function Set Information Quantity
Abstract: Traditional neural-network learning algorithms often suffer from under-learning or over-learning, which tends to produce an unreasonable network structure and a prediction function with poor generalization ability. Drawing on the concept of function-set information quantity, this paper proposes an improved network learning algorithm that takes the inverse of the signal-to-noise ratio as its performance index, and analyzes in depth the causes of under-learning and over-learning during neural-network training. Simulation results show that the algorithm is simple, highly adaptive, and fast to converge; it effectively overcomes under-learning and over-learning, and the resulting prediction function generalizes well.
Source: Computer Simulation (《计算机仿真》, CSCD), 2005, Supplement 1, pp. 251-253 (3 pages).
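The abstract describes the algorithm only at a high level, and the paper itself is not reproduced on this page. As a loose, hypothetical illustration of the idea it names, using the inverse of a signal-to-noise ratio as a model-selection performance index that penalizes both under-learning and over-learning, the sketch below scores curve fits of increasing complexity on a held-out split. The synthetic data, the polynomial model family, and the exact form of the index are all assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a cubic "signal" plus additive noise.
x = np.linspace(-1.0, 1.0, 60)
y_true = 1.5 * x**3 - 0.8 * x
y = y_true + rng.normal(scale=0.15, size=x.size)

# Hold out a validation split to estimate generalization.
idx = rng.permutation(x.size)
tr, va = idx[:40], idx[40:]

def inverse_snr(degree):
    """Fit a degree-`degree` polynomial on the training split and
    return an inverse signal-to-noise ratio on the validation split:
    residual (noise) power divided by predicted-signal power.
    This is an assumed stand-in for the paper's performance index."""
    coeffs = np.polyfit(x[tr], y[tr], degree)
    pred = np.polyval(coeffs, x[va])
    noise_power = np.mean((y[va] - pred) ** 2)
    signal_power = np.mean(pred ** 2)
    return noise_power / signal_power

# Scan model complexity: too-low degrees under-learn (large bias),
# too-high degrees over-learn (fit the noise); both inflate the index.
scores = {d: inverse_snr(d) for d in range(1, 10)}
best = min(scores, key=scores.get)
```

Minimizing this index over complexity plays the same role as the abstract's criterion: it rewards fits that explain the signal without chasing the noise, so the chosen degree sits between the under-learning and over-learning regimes.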
