Abstract
To address the traditional Adaboost algorithm's sensitivity to noisy samples and the shortcomings of linearly combining base classifiers, a piecewise nonlinear Adaboost algorithm with noise self-detection, called NDK Adaboost, is proposed. Exploiting the fact that the training error rate of traditional Adaboost decreases exponentially with the number of iterations, NDK Adaboost directly builds a noise detection model to identify noisy samples. In the prediction stage, it maps each test sample to its relative position among the training samples and determines the weights of the base classifiers according to the distribution of neighboring samples, so that the algorithm achieves high classification accuracy under different sample distributions. Experimental results show that, compared with the traditional Adaboost algorithm and related improved Adaboost algorithms, NDK Adaboost achieves higher classification accuracy.
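As a rough illustration of the prediction-stage idea described above, the following Python sketch weights each trained base classifier by its accuracy on the k training samples nearest to the query point, so the combination varies with the local sample distribution. The function name `knn_weighted_predict`, the parameter `k`, and the weighting rule are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch of a piecewise (locally weighted) combination of base classifiers,
# assuming `base_clfs` are already-trained binary classifiers with labels in {-1, +1}.
# All names and the weighting rule below are assumptions for illustration only.
import numpy as np

def knn_weighted_predict(x, X_train, y_train, base_clfs, k=10):
    """Predict the label of a single sample x.

    Instead of the fixed global weights of standard Adaboost, each base
    classifier is weighted by its accuracy on the k training samples
    closest to x, i.e. the neighborhood that x maps into.
    """
    # Map the query point to its neighborhood in the training set.
    dists = np.linalg.norm(X_train - x, axis=1)
    nn_idx = np.argsort(dists)[:k]
    X_nn, y_nn = X_train[nn_idx], y_train[nn_idx]

    score = 0.0
    for clf in base_clfs:
        # Local accuracy of this base classifier around x (in [0, 1]).
        local_acc = np.mean(clf.predict(X_nn) == y_nn)
        # Classifiers that do no better than chance in this neighborhood
        # contribute nothing to the combined decision.
        weight = max(local_acc - 0.5, 0.0)
        score += weight * clf.predict(x.reshape(1, -1))[0]
    return 1 if score >= 0 else -1
```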
Source
《计算机工程》
CAS
CSCD
北大核心 (Peking University Core Journals)
2017, No. 5, pp. 163-168, 173 (7 pages)
Computer Engineering