A Parallel Improved K-Nearest Neighbor Classification Method
Abstract: Because the traditional K-Nearest Neighbor (K-NN) classification method cannot efficiently handle classification problems with large-scale training data, this paper proposes a parallel improved K-NN (Improved Parallel K-Nearest Neighbor, IPK-NN) classification method. The method first randomly partitions the large-scale training samples into several independent, identically distributed working sets. For any new query sample, the standard K-NN method is applied on each working set to label the sample, and the labels produced by the individual working sets are then combined to obtain the sample's final label. Experimental results show that on large-scale classification problems, IPK-NN improves learning efficiency while maintaining high classification accuracy.
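The abstract describes the IPK-NN procedure only at a high level: partition the training data into independent working sets, classify the query sample with standard K-NN on each set, and combine the per-set labels. The sketch below illustrates that idea in Python; the function name ipk_nn_predict, the majority-vote combination rule, and the use of scikit-learn's KNeighborsClassifier are assumptions made for illustration, not the authors' implementation.

from collections import Counter
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def ipk_nn_predict(X_train, y_train, X_query, n_working_sets=4, k=5, seed=0):
    """Illustrative sketch of the IPK-NN idea: split the training set into
    working sets, run standard K-NN on each, and majority-vote the labels.
    (Hypothetical names and combination rule; not the paper's code.)"""
    rng = np.random.default_rng(seed)
    # A random permutation split approximates independent, identically
    # distributed working sets drawn from the full training sample.
    splits = np.array_split(rng.permutation(len(X_train)), n_working_sets)

    def knn_on_working_set(idx):
        clf = KNeighborsClassifier(n_neighbors=k)
        clf.fit(X_train[idx], y_train[idx])
        return clf.predict(X_query)

    # Each working set is classified independently, so this step is
    # embarrassingly parallel; a thread pool stands in for the paper's
    # parallel scheme here.
    with ThreadPoolExecutor() as pool:
        per_set_labels = list(pool.map(knn_on_working_set, splits))

    # Combine: majority vote across working sets for each query sample.
    votes = np.stack(per_set_labels)  # shape (n_working_sets, n_query)
    return np.array([Counter(votes[:, j]).most_common(1)[0][0]
                     for j in range(votes.shape[1])])

A call such as ipk_nn_predict(X_train, y_train, X_query, n_working_sets=8, k=5) would return one predicted label per query row; increasing n_working_sets shrinks each per-set K-NN search at the cost of combining more votes.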
Author: 邱强
Source: Computer Knowledge and Technology (《电脑知识与技术》), 2014, No. 4X, pp. 2825-2827, 2844 (4 pages)
Keywords: K-nearest neighbor classification; parallel computing; parallel K-NN classification; working set