
Tuning Parameter Selection Based on Blocked 3×2 Cross-Validation in High Dimensional Regression
Abstract: We apply the blocked 3×2 cross-validation method to tuning parameter selection in high-dimensional regression. In traditional regression models, the commonly used tuning parameter selection methods, namely the information criteria (AIC, BIC) and standard K-fold cross-validation, can identify the true model consistently; however, these criteria tend to fail on high-dimensional data. Recent research shows that 2-fold cross-validation has advantages in computational complexity, model selection, and the comparison of model performance, especially the recently proposed blocked 3×2 cross-validation. We therefore apply blocked 3×2 cross-validation to tuning parameter selection in high-dimensional regression. First, the ISIS method reduces the model dimension to below the sample size. Then the AENET method further reduces the dimension of the screened model and estimates its parameters, with the tuning parameter selected by blocked 3×2 cross-validation. Taking into account the EMSE values, variance, and computational complexity of the competing tuning parameter selection methods (AIC, BIC, EBIC, HBIC, 5-fold cross-validation, and blocked 3×2 cross-validation) in simulation experiments, the blocked 3×2 cross-validation method proves competitive.
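The screening step in the abstract can be sketched as follows. This is a minimal sketch of one-step SIS (sure independence screening), the building block that ISIS applies iteratively on residuals; the function name `sis_screen`, the fixed submodel size `d`, and the toy data are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sis_screen(X, y, d):
    """One-step Sure Independence Screening: keep the d predictors with
    the largest absolute marginal correlation with the response.
    (ISIS iterates this screen on residuals; only the one-step
    version is sketched here.)"""
    Xc = X - X.mean(axis=0)          # center each column
    yc = y - y.mean()
    corr = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    return np.sort(np.argsort(corr)[-d:])   # indices of the d strongest predictors

# Toy example: p = 200 predictors, n = 50 observations, signal in columns 0 and 1.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.standard_normal(50)
kept = sis_screen(X, y, d=10)
```

After screening, the dimension is below the sample size, so penalized estimators such as AENET can be applied to the retained columns `X[:, kept]`.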
Source: Journal of Yunnan Normal University (Natural Sciences Edition), 2015, No. 3, pp. 27-32 (6 pages).
Funding: Shanxi Province Science and Technology Infrastructure Platform Construction Project (20130910030101).
Keywords: tuning parameter selection; blocked 3×2 cross-validation; EMSE criterion
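The selection procedure itself can be sketched as follows, assuming the standard blocked 3×2 construction: the sample is cut into four equal blocks, which are paired into the three distinct 2-fold partitions, giving six train/test fits per candidate value. Closed-form ridge regression stands in for the AENET estimator so the sketch stays self-contained; the function names and candidate grid are illustrative, not the authors' implementation.

```python
import numpy as np

def blocked_3x2_splits(n, rng):
    """Four equal blocks paired into the three distinct 2-fold
    partitions of blocked 3x2 cross-validation (6 train/test fits)."""
    b = np.array_split(rng.permutation(n), 4)
    splits = []
    for (i, j), (k, l) in [((0, 1), (2, 3)), ((0, 2), (1, 3)), ((0, 3), (1, 2))]:
        f1, f2 = np.concatenate([b[i], b[j]]), np.concatenate([b[k], b[l]])
        splits += [(f1, f2), (f2, f1)]   # each partition yields two estimates
    return splits

def ridge_fit(X, y, lam):
    """Closed-form ridge estimate; a simple stand-in for the AENET step."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def select_lambda(X, y, lambdas, seed=0):
    """Pick the lambda minimising the blocked 3x2 CV prediction error,
    averaged over the six half-sample fits."""
    splits = blocked_3x2_splits(len(y), np.random.default_rng(seed))
    cv_err = [np.mean([np.mean((X[te] @ ridge_fit(X[tr], y[tr], lam) - y[te]) ** 2)
                       for tr, te in splits])
              for lam in lambdas]
    return lambdas[int(np.argmin(cv_err))]

# Toy example on a well-conditioned problem: heavy shrinkage should lose.
rng = np.random.default_rng(1)
X = rng.standard_normal((80, 5))
y = X @ np.array([2.0, -1.0, 0.5, 0.0, 0.0]) + 0.1 * rng.standard_normal(80)
best = select_lambda(X, y, lambdas=[0.01, 1.0, 100.0])
```

Because each candidate is scored on the same six splits, the comparison between candidates is paired, which is the source of the variance advantage the paper attributes to the blocked 3×2 construction.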
