Abstract
The traditional K-Nearest Neighbor (KNN) method is prone to indecision or misclassification when classifying samples. To address this, a new classification algorithm, FECD-KNN, is proposed by combining feature entropy with KNN. The algorithm adopts entropy as a measure of class correlation and computes sample distances from the differences between these correlation values. Information entropy theory is used to normalize the classification correlation, and the degree to which each feature influences classification is measured by the difference between correlation values, thereby establishing an intrinsic link between the distance metric and the class labels. Comparative simulation results show that, relative to KNN and Entropy-KNN, FECD-KNN improves classification accuracy while maintaining classification efficiency.
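The abstract describes weighting the distance metric by an entropy-derived, per-feature class-correlation score. The paper's exact formulas are not given here, so the following is only a minimal sketch of the general idea under an assumed weighting scheme: each feature's relevance is taken as its information gain with respect to the class, and that relevance scales the feature's contribution to a weighted Euclidean distance. All function names (`feature_weights`, `knn_classify`, etc.) are illustrative, not from the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_vals, labels):
    """Reduction in class entropy after grouping samples by this feature's value."""
    base = entropy(labels)
    n = len(labels)
    groups = {}
    for v, y in zip(feature_vals, labels):
        groups.setdefault(v, []).append(y)
    conditional = sum(len(g) / n * entropy(g) for g in groups.values())
    return base - conditional

def feature_weights(X, y):
    """One entropy-based relevance weight per feature (assumed scheme, not the paper's)."""
    return [information_gain([row[j] for row in X], y) for j in range(len(X[0]))]

def knn_classify(X, y, query, k, weights):
    """Majority vote among the k nearest training samples under a weighted distance."""
    def dist(a, b):
        return math.sqrt(sum(w * (ai - bi) ** 2 for w, ai, bi in zip(weights, a, b)))
    nearest = sorted(zip(X, y), key=lambda pair: dist(pair[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

In this sketch, a feature that carries no class information gets weight 0 and drops out of the distance entirely, which is the mechanism the abstract attributes to FECD-KNN: features are scaled by how strongly they affect classification, rather than contributing equally as in plain KNN.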
Source
Computer Engineering (《计算机工程》)
Indexed in: CAS, CSCD, Peking University Core (北大核心)
2011, No. 17, pp. 146-148 (3 pages)
Keywords
K-Nearest Neighbor (KNN) algorithm
entropy
correlation
difference