Abstract
This paper presents a fast AdaBoost training algorithm based on dynamic weight trimming, which greatly accelerates training on large datasets. At each iteration, the algorithm discards the majority of samples, those with small weights, and trains the weak classifier on only the few samples with large weights. It then checks the performance of this weak classifier on all samples: if the weighted error exceeds 0.5, the number of training samples is enlarged and the weak classifier for that iteration is retrained. Because most iterations train the weak classifier on only a small portion of the samples, the overall training speed is greatly increased.
Source
《计算机学报》
EI
CSCD
PKU Core Journals (北大核心)
2009, No. 2, pp. 336-341 (6 pages)
Chinese Journal of Computers
Funding
Supported by the National Natural Science Foundation of China (60872084) and the Specialized Research Fund for the Doctoral Program of Higher Education (20060003102).