
AdaBoost Algorithm with Classification Belief
Abstract: Ensemble learning is a widely accepted and widely used class of machine learning algorithms. This paper proposes a new multi-class ensemble learning algorithm named AdaBoost belief. The algorithm improves AdaBoost.SAMME by attaching a weight to each class in every weak classifier. These per-class weights, called class beliefs, are computed from the per-class accuracy collected in each round of the iteration. We compare the proposed algorithm with AdaBoost.SAMME in terms of prediction accuracy, generalization ability, and theoretical support. Experimental results on Gaussian data sets, several UCI data sets, and a number of log-based multi-class intrusion detection applications indicate that the proposed method has competitive learning ability and higher prediction accuracy. Moreover, when the number of classes increases and classes therefore become harder to predict, the classification error rate of the proposed algorithm rises more slowly than that of AdaBoost.SAMME.
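The abstract describes the mechanism but not the exact formulas, so the following is a minimal Python sketch of the idea: a SAMME-style booster in which each round additionally records a per-class weighted accuracy (its class beliefs) and, at prediction time, scales its vote by the belief it holds in the class it predicts. The decision-stump base learner, the 0..K-1 integer label encoding, the smoothing constants, and the exact way beliefs enter the final vote are assumptions for illustration, not the paper's specification.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_belief_fit(X, y, n_rounds=50):
    # Hypothetical sketch; labels assumed to be integers 0..K-1.
    n, K = len(y), int(y.max()) + 1
    w = np.full(n, 1.0 / n)                # uniform initial sample weights
    ensemble = []                          # list of (classifier, alpha, beliefs)
    for _ in range(n_rounds):
        clf = DecisionTreeClassifier(max_depth=1)  # assumed base learner
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        miss = (pred != y).astype(float)
        err = w @ miss                     # weighted error (w sums to 1)
        if err >= 1.0 - 1.0 / K:           # no better than random guessing
            break
        # Standard SAMME classifier weight with the multi-class log(K-1) term
        alpha = np.log((1.0 - err) / max(err, 1e-10)) + np.log(K - 1.0)
        # Class beliefs: this round's weighted accuracy on each class
        beliefs = np.array([
            w[y == k] @ (pred[y == k] == k).astype(float)
            / max(w[y == k].sum(), 1e-10)
            for k in range(K)
        ])
        ensemble.append((clf, alpha, beliefs))
        w *= np.exp(alpha * miss)          # re-weight misclassified samples
        w /= w.sum()
    return ensemble

def adaboost_belief_predict(ensemble, X, K):
    # Assumed voting rule: each round votes for its predicted class
    # with weight alpha scaled by its belief in that class.
    votes = np.zeros((len(X), K))
    for clf, alpha, beliefs in ensemble:
        pred = clf.predict(X)
        votes[np.arange(len(X)), pred] += alpha * beliefs[pred]
    return votes.argmax(axis=1)

Letting a round vote with strength alpha * belief means a classifier that is reliable on some classes but weak on others contributes unevenly across classes, which matches the intuition behind the reported slower growth of the error rate as the number of classes increases.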
Source: Journal of Applied Sciences (《应用科学学报》, CAS / CSCD / Peking University Core Journal), 2015, Issue 2, pp. 203-214.
Funding: Supported by the National Natural Science Foundation of China (No. 61103067).
Keywords: ensemble learning, multi-class, class belief, class weight, AdaBoost.SAMME
