[4] Chapelle O, Vapnik V N, Bengio Y. Model selection for small-sample regression[J]. Machine Learning Journal, 2002, 48(1): 9-23.
Secondary references (26)
1. Sugiyama M, Ogawa H. Subspace information criterion for model selection. Neural Computation, 2001, 13(8): 1863-1889.
2. Stolcke A. Bayesian learning of probabilistic language models [Ph.D. dissertation]. University of California, Berkeley, 1994.
3. Ishwaran H, James L F, Sun J. Bayesian model selection in finite mixtures by marginal density decompositions. Journal of the American Statistical Association, 2001, 96(456): 1316-1332.
4. Cherkassky V, Shao X, Mulier F M, Vapnik V N. Model complexity control for regression using VC generalization bounds. IEEE Trans. on Neural Networks, 1999, 10(5): 1075-1089.
5. Barron A R, Cover T M. Minimum complexity density estimation. IEEE Trans. on Information Theory, 1991, 37(4): 1034-1054.
6. Yamanishi K. A decision-theoretic extension of stochastic complexity and its application to learning. IEEE Trans. on Information Theory, 1998, 44(4): 1424-1439.
7. Wood S N. Modelling and smoothing parameter estimation with multiple quadratic penalties. J. Royal Statist. Soc. B, 2000, 62(1): 413-428.
8. Chapelle O, Vapnik V N, Bengio Y. Model selection for small-sample regression. Machine Learning Journal, 2002, 48(1): 9-23.
9. Hurvich C M, Tsai C L. Regression and time series model selection in small samples. Biometrika, 1989, 76(2): 297-307.
10. Bousquet O, Elisseeff A. Stability and generalization. Journal of Machine Learning Research, 2002, 2: 499-526.