Journal Article

Research on a Neighborhood Distance-Based Classification Method
Abstract: In the neighborhood rough set model, as the neighborhood radius (and hence the size of the information granules) grows, the neighborhood classifier based on the majority-voting rule tends to misjudge the classes of unknown samples. To alleviate this problem, we adopt the idea of minimum average distance and, building on the neighborhood classifier, design a neighborhood distance based classifier, namely the neighborhood distance classifier. The neighborhood distance classifier first identifies the neighborhood space of a test sample with the neighborhood rough set model, then replaces the majority-voting rule with a minimum-average-distance criterion, and finally takes the class whose samples in the neighborhood space have the minimum average distance to the test sample as the predicted class label. Experimental results on 6 UCI data sets show that: 1) compared with the neighborhood classifier, the proposed neighborhood distance classifier achieves satisfactory classification results under larger neighborhood radii; 2) after attribute reduction, the neighborhood distance classifier still achieves higher classification accuracy than the neighborhood classifier under larger neighborhood radii.
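
The decision rule described in the abstract can be summarized concretely. The Python sketch below is an illustrative reimplementation under assumed choices, not the authors' code: it assumes a Euclidean metric on numeric features, a single fixed neighborhood radius delta, and a nearest-sample fallback when the delta-neighborhood is empty (the function name neighborhood_distance_classify and the fallback are assumptions added for this example).

import numpy as np

def neighborhood_distance_classify(X_train, y_train, x_test, delta):
    # Step 1: build the delta-neighborhood of the test sample under the
    # neighborhood rough set model (all training samples within distance delta).
    dists = np.linalg.norm(X_train - x_test, axis=1)
    in_neighborhood = dists <= delta
    if not np.any(in_neighborhood):
        # Assumed fallback (not from the paper): empty neighborhood -> nearest sample.
        return y_train[np.argmin(dists)]

    # Step 2: for each class present in the neighborhood, compute the average
    # distance from the test sample to that class's neighbors.
    neighbor_dists = dists[in_neighborhood]
    neighbor_labels = y_train[in_neighborhood]
    classes = np.unique(neighbor_labels)
    avg_dists = [neighbor_dists[neighbor_labels == c].mean() for c in classes]

    # Step 3: predict the class with the minimum average distance, replacing
    # the majority-voting rule of the neighborhood classifier.
    return classes[int(np.argmin(avg_dists))]

# Illustrative usage on made-up toy data.
if __name__ == "__main__":
    rng = np.random.default_rng(42)
    X_train = rng.random((60, 4))
    y_train = (X_train[:, 0] > 0.5).astype(int)
    x_test = rng.random(4)
    print(neighborhood_distance_classify(X_train, y_train, x_test, delta=0.4))

Compared with majority voting, this rule weights each class by how close its neighbors actually are to the test sample, which is consistent with the abstract's observation that the proposed classifier degrades less as the neighborhood radius (information granule size) grows.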
Authors: WANG Yi-bo (王怡博), WEN Hui-xiang (文辉祥), DOU Hui-li (窦慧莉); School of Computer Science and Engineering, Southeast University, Nanjing 211189, China; School of Computer, Jiangsu University of Science and Technology, Zhenjiang 212003, China
Source: Electronic Design Engineering (电子设计工程), 2019, No. 4, pp. 21-24, 29 (5 pages)
Funding: National Natural Science Foundation of China (61572242, 61502211, 61503160)
Keywords: classification; distance; attribute reduction; neighborhood rough set; neighborhood classifier
