Abstract
AdaBoost is a popular classification algorithm in machine learning. By studying the characteristics of weak classifiers, this paper proposes two new methods for computing a weak classifier's threshold and bias. Both methods keep each weak classifier's recognition rate above 50%, which guarantees that AdaBoost training converges once the number of weak classifiers reaches a sufficient size. Simulation experiments on the two threshold-and-bias computation methods show that, with the misclassification rate kept within an acceptable range, both methods can build a strong classifier with a high recognition rate using relatively few weak classifiers.
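The mechanism the abstract relies on, i.e. keeping every weak classifier's weighted accuracy above 50% so that boosting converges, can be sketched as follows. This is a generic AdaBoost with single-threshold decision stumps, not the paper's specific threshold and bias formulas; all function names are illustrative. The key step is flipping the stump's polarity whenever its weighted error exceeds 0.5, which enforces the >50% accuracy condition the paper requires.

```python
import numpy as np

def train_stump(X, y, w):
    """Pick the (feature, threshold, polarity) with the lowest weighted error.
    Flipping polarity when error > 0.5 keeps accuracy at or above 50%,
    mirroring the condition the paper places on each weak classifier."""
    n, d = X.shape
    best = (0, 0.0, 1, 1.0)  # feature index, threshold, polarity, error
    for j in range(d):
        for thr in np.unique(X[:, j]):
            pred = np.where(X[:, j] >= thr, 1, -1)
            err = w[pred != y].sum()
            polarity = 1
            if err > 0.5:                 # flip the sign: accuracy now > 50%
                err, polarity = 1.0 - err, -1
            if err < best[3]:
                best = (j, thr, polarity, err)
    return best

def adaboost(X, y, T=20):
    """Standard AdaBoost loop: reweight samples toward current mistakes."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(T):
        j, thr, pol, err = train_stump(X, y, w)
        err = max(err, 1e-12)             # guard against division by zero
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)    # misclassified points gain weight
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Sign of the alpha-weighted vote of all stumps (the strong classifier)."""
    score = sum(a * p * np.where(X[:, j] >= t, 1, -1)
                for a, j, t, p in ensemble)
    return np.sign(score)
```

Because each round's weak learner is strictly better than chance, the weighted training error of the ensemble decreases with every added stump, which is exactly why relatively few weak classifiers can already yield a strong classifier.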
Source
Journal of Image and Graphics (《中国图象图形学报》)
Indexed in CSCD; Peking University Core Journal
2009, No. 11, pp. 2411-2415 (5 pages)
Funding
Shanghai Natural Science Foundation (08ZR1408200)
Shanghai Key Discipline Construction Project (J50103)
Open Project of the National Laboratory of Pattern Recognition, Chinese Academy of Sciences (08-2-16)