
A New Boosting Regression Tree Method (一种新的Boosting回归树方法)

Cited by: 5
Abstract: The gradient boosting interpretation of how boosting works assumes that the space spanned by the base learners is a continuous functional space; in practice, however, the base-learner space formed under a finite sample is not necessarily continuous. To address this problem, this paper starts from the additive-model perspective and, under squared loss, proposes a new resampling boosting regression tree method. The method is a stagewise update algorithm for a weighted additive model. Numerical experiments show that it markedly improves on a single regression tree, reduces prediction error, and attains lower prediction error than the L2Boost algorithm.
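To make the stagewise additive update concrete, the following is a minimal Python sketch of a squared-loss boosting procedure in which each regression tree is fit to the current residuals on a resampled subset of the training data. The function names and the `learning_rate`, `subsample`, and `max_depth` parameters are illustrative assumptions; the abstract does not specify the paper's exact resampling or weighting scheme.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def fit_resample_boost(X, y, n_rounds=100, learning_rate=0.1,
                       subsample=0.7, max_depth=3, random_state=0):
    """Stagewise additive boosting of regression trees under squared loss.

    Each round fits a shallow tree to the current residuals on a random
    subsample of the data, then adds it to the ensemble with a shrinkage
    weight. Parameter names are illustrative, not the paper's.
    """
    rng = np.random.default_rng(random_state)
    n = X.shape[0]
    init = float(np.mean(y))             # F_0: constant fit (sample mean)
    pred = np.full(n, init)
    trees = []
    for _ in range(n_rounds):
        residual = y - pred              # negative gradient of squared loss
        idx = rng.choice(n, size=max(1, int(subsample * n)), replace=False)
        tree = DecisionTreeRegressor(max_depth=max_depth,
                                     random_state=random_state)
        tree.fit(X[idx], residual[idx])  # base learner on a resampled subset
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return init, trees


def predict_resample_boost(init, trees, X, learning_rate=0.1):
    """Sum the shrunken tree predictions on top of the constant initial fit."""
    out = np.full(X.shape[0], init)
    for tree in trees:
        out += learning_rate * tree.predict(X)
    return out
```

With `subsample=1.0` this reduces to plain L2Boost-style residual fitting on the full sample, so the resampling step is the main point of departure suggested by the abstract.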
Authors: 宋捷, 吴喜之
Source: 《统计与信息论坛》 (Journal of Statistics and Information), CSSCI, 2010, No. 5, pp. 9-13 (5 pages)
Funding: Major Project of a Key Research Base of the Ministry of Education, "Spatial Statistics and Its Applications" (05JJD910001)
Keywords: boosting, regression tree, resampling, prediction error

