
Design and Application of Adaboost Algorithm Classifier (Cited by: 13)
Abstract: The AdaBoost algorithm can boost weak classifiers with only moderate performance into a strong classifier with good performance, and it does not need to know the weak classifier's error-rate upper bound in advance, so many algorithms whose classification results are unstable can be used as weak classifiers for AdaBoost. Because of the inherent limitations of the BP neural network algorithm and the subjectivity involved in selecting its training samples, its classification accuracy and generalization ability need improvement. The AdaBoost algorithm is therefore combined with the BP neural network, with the neural network classification model serving as the weak classifier. The algorithm is implemented in MATLAB and tested on two UCI classification data sets; the results show that AdaBoost can effectively remedy the shortcomings of the BP neural network and improve both classification accuracy and generalization.
Authors: 许剑 (Xu Jian), 张洪伟 (Zhang Hongwei)
Source: Journal of Sichuan University of Science & Engineering (Natural Science Edition) (《四川理工学院学报(自然科学版)》), CAS, 2014, Issue 1, pp. 28-31 (4 pages)
Keywords: weak classifier; strong classifier; BP neural network; AdaBoost algorithm
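To make the approach described in the abstract concrete, below is a minimal sketch of AdaBoost.M1-style boosting with a small neural network standing in for the BP weak classifier. It is written in Python with scikit-learn rather than the authors' MATLAB code, and the Iris dataset, the single 5-unit hidden layer, and the 10 boosting rounds are illustrative assumptions, not values from the paper; weighted resampling is used in place of direct sample weighting because MLPClassifier does not accept per-sample weights.

```python
# Minimal AdaBoost.M1 sketch with a small MLP as the weak learner.
# Assumptions not taken from the paper: Iris data, one 5-unit hidden layer,
# 10 boosting rounds, and weighted resampling instead of sample weights.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def adaboost_m1(X, y, n_rounds=10, rng=np.random.default_rng(0)):
    n = len(y)
    w = np.full(n, 1.0 / n)            # sample weights, initially uniform
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Train the weak learner on a bootstrap sample drawn with the current weights.
        idx = rng.choice(n, size=n, replace=True, p=w)
        clf = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
        clf.fit(X[idx], y[idx])
        pred = clf.predict(X)
        err = np.sum(w[pred != y])     # weighted training error
        if err >= 0.5:                 # weak learner no better than chance: stop
            break
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)   # vote weight of this learner
        # Increase weights of misclassified samples, decrease the rest, renormalize.
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X, classes):
    # Alpha-weighted vote over all weak learners.
    votes = np.zeros((len(X), len(classes)))
    for clf, a in zip(learners, alphas):
        pred = clf.predict(X)
        for k, c in enumerate(classes):
            votes[pred == c, k] += a
    return classes[np.argmax(votes, axis=1)]

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
learners, alphas = adaboost_m1(X_tr, y_tr)
acc = np.mean(predict(learners, alphas, X_te, np.unique(y_tr)) == y_te)
print(f"boosted test accuracy: {acc:.3f}")
```

Each round enlarges the weights of misclassified samples so later networks concentrate on the hard cases, and the final prediction is an alpha-weighted vote over all the networks; this is the mechanism by which unstable weak learners are combined into a stronger ensemble.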
