
Algorithm of Forward Feature Selection and Aggregation of Classifiers Based on Feature Effective Range (cited by 8)
Abstract: The overlapping region of a feature's values across different classes reflects its discriminative ability. Based on the effective range of each feature within every class and the sample distribution density in each region, this paper proposes FFS-ER, an algorithm of forward feature selection and aggregation of classifiers based on feature effective ranges. The algorithm adopts a forward feature search strategy and builds the classification model during the feature selection process. To demonstrate its effectiveness, FFS-ER is compared on eight public datasets with SVM-RFE, a popular and high-performing backward feature selection algorithm, and FIM, a forward feature selection algorithm. Experimental results show that classification models built on the feature subsets selected by FFS-ER are clearly more accurate than those built on features selected by FIM, and better than those by SVM-RFE in most cases. A comparison of standard deviations further indicates that FFS-ER is more stable than SVM-RFE and FIM.
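The abstract does not give FFS-ER's exact formulas, but the core idea it describes — score each feature by how much its class-wise effective ranges overlap, then add features greedily in a forward search — can be sketched as follows. The `[min, max]` effective range and the overlap ratio below are illustrative assumptions, and the sketch omits the classifier aggregation that the full algorithm builds during selection.

```python
# Illustrative sketch only: the effective-range definition ([min, max] per
# class) and the overlap score are assumptions, not the paper's exact method.

def effective_range(values):
    """Effective range of a feature within one class: here simply [min, max]."""
    return min(values), max(values)

def overlap_score(by_class):
    """Fraction of the pooled value range covered by the common overlap of
    the class-wise effective ranges (lower = more discriminative)."""
    ranges = [effective_range(v) for v in by_class]
    lo = max(r[0] for r in ranges)           # start of the common overlap
    hi = min(r[1] for r in ranges)           # end of the common overlap
    full_lo = min(r[0] for r in ranges)      # pooled range over all classes
    full_hi = max(r[1] for r in ranges)
    if full_hi == full_lo:
        return 1.0                           # degenerate feature: no spread
    return max(0.0, hi - lo) / (full_hi - full_lo)

def forward_select(X, y, k):
    """Greedy forward search: repeatedly add the not-yet-selected feature
    whose class-wise effective ranges overlap the least."""
    n_features = len(X[0])
    classes = sorted(set(y))
    selected = []
    while len(selected) < k:
        candidates = [f for f in range(n_features) if f not in selected]
        best = min(
            candidates,
            key=lambda f: overlap_score(
                [[x[f] for x, label in zip(X, y) if label == c] for c in classes]
            ),
        )
        selected.append(best)
    return selected
```

With two classes whose values on feature 0 are disjoint but heavily overlapped on feature 1, `forward_select` picks feature 0 first, since its overlap score is zero. A marginal score like this evaluates each feature in isolation, which is what keeps forward search cheap compared with backward elimination such as SVM-RFE.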
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), CSCD, Peking University Core, 2016, No. 6, pp. 1159-1163 (5 pages)
Funding: National Natural Science Foundation of China (21375011); Sino-German Joint Research Center project (GZ753)
Keywords: effective range; sample distribution; feature selection; aggregation of classifiers
Related literature

References: 3 | Secondary references: 68

  • 1 Li Wei-hong, Gong Wei-guo, Chen Wei-min, Liang Yi-xiong, Yin Ke-zhong. Face feature selection method based on SVM-RFE[J]. Opto-Electronic Engineering, 2006, 33(5): 113-117. (cited 4 times)
  • 2 Li G-Z, Yang J Y. Feature selection for ensemble learning and its application[M]. Machine Learning in Bioinformatics, 2008: 135-155.
  • 3 Sheinvald J, Dom B, Niblack W. A modelling approach to feature selection[C]. Proc of 10th Int Conf on Pattern Recognition, 1990, 6(1): 535-539.
  • 4 Cardie C. Using decision trees to improve case-based learning[C]. Proc of 10th Int Conf on Machine Learning. Amherst, 1993: 25-32.
  • 5 Modrzejewski M. Feature selection using rough sets theory[C]. Proc of the European Conf on Machine Learning. 1993: 213-226.
  • 6 Ding C, Peng H. Minimum redundancy feature selection from microarray gene expression data[J]. J of Bioinformatics and Computational Biology, 2005, 3(2): 185-205.
  • 7 Fleuret F. Fast binary feature selection with conditional mutual information[J]. J of Machine Learning Research, 2004, 5(10): 1531-1555.
  • 8 Kwak N, Choi C-H. Input feature selection by mutual information based on Parzen window[J]. IEEE Trans on Pattern Analysis and Machine Intelligence, 2002, 24(12): 1667-1671.
  • 9 Novovicova J, Petr S, Michal H, et al. Conditional mutual information based feature selection for classification task[C]. Proc of the 12th Iberoamerican Congress on Pattern Recognition. Valparaiso, 2007: 417-426.
  • 10 Qu G, Hariri S, Yousif M. A new dependency and correlation analysis for features[J]. IEEE Trans on Knowledge and Data Engineering, 2005, 17(9): 1199-1207.

Co-cited literature: 210 | Shared citing literature: 46 | Citing literature: 8 | Secondary citing literature: 27
