

A Feature Selection Method for Small Samples
Abstract: For small samples, common machine learning algorithms often fail to obtain good results because the feature dimension is large relative to the number of samples and irrelevant or redundant features frequently exist. Reducing the dimension through feature selection is an effective way to address this problem. This paper proposes a filter feature selection method based on mutual information for small samples. First, a feature grouping criterion based on mutual information is defined; it considers both the correlation between each feature and the class and the redundancy among different features, and the features are grouped according to this criterion. The feature with the maximal correlation with the class in each group is then chosen to compose a candidate feature subset, which keeps the time complexity of the algorithm low. After that, the grouping-based selection is combined with the Boruta algorithm to determine the optimal feature subset automatically from the candidate subset, so the feature dimension can be reduced greatly. Compared with five classical feature selection algorithms, experimental results on benchmark data sets with three kinds of classifiers demonstrate that the feature subset selected by the proposed method achieves better classification performance and running efficiency.
Authors: Xu Hang, Zhang Kai, Wang Wenjian (School of Computer and Information Technology, Shanxi University, Taiyuan 030006; Key Laboratory of Computational Intelligence and Chinese Information Processing (Shanxi University), Ministry of Education, Taiyuan 030006)
Source: Journal of Computer Research and Development (《计算机研究与发展》), 2018, No. 10, pp. 2321-2330 (10 pages); indexed in EI, CSCD, and the Peking University Core Journals list
Funding: National Natural Science Foundation of China (61673249); Shanxi Province Research Foundation for Returned Overseas Scholars (2016-004); CERNET Innovation Project (NGII20170601)
Keywords: small samples; feature selection; mutual information; feature grouping; filter algorithm
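
The abstract describes the pipeline only in prose, so the following is a minimal sketch of how such a grouping-then-Boruta pipeline could look in Python, assuming scikit-learn for the mutual-information estimates and the third-party boruta package (BorutaPy) for the final selection step. The greedy grouping rule, the redundancy_ratio threshold, the function name group_and_select, and the use of a random forest as Boruta's base estimator are illustrative assumptions, not the authors' implementation; the paper defines its own grouping criterion, which is not reproduced here.

    # Sketch of a mutual-information feature-grouping pipeline followed by Boruta.
    # Assumptions (not from the paper): greedy grouping against each group's
    # representative, the redundancy_ratio threshold, and a random forest base
    # estimator for Boruta.
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif, mutual_info_regression
    from sklearn.ensemble import RandomForestClassifier
    from boruta import BorutaPy  # third-party `boruta` package

    def group_and_select(X, y, redundancy_ratio=1.0, random_state=0):
        relevance = mutual_info_classif(X, y, random_state=random_state)

        # Visit features in decreasing relevance; a feature joins an existing group
        # when its MI with the group's representative exceeds redundancy_ratio times
        # its own relevance (it is "more redundant than relevant").
        order = np.argsort(relevance)[::-1]
        groups = []  # each group is a list of indices; groups[i][0] is the representative
        for j in order:
            placed = False
            for g in groups:
                rep = g[0]
                mi_jr = mutual_info_regression(
                    X[:, [rep]], X[:, j], random_state=random_state
                )[0]
                if mi_jr >= redundancy_ratio * relevance[j]:
                    g.append(j)
                    placed = True
                    break
            if not placed:
                groups.append([j])

        # The representative of each group is its most class-relevant member
        # (the first feature inserted, since features were visited in relevance order).
        candidates = np.array([g[0] for g in groups])

        # Boruta determines the final subset among the candidates.
        rf = RandomForestClassifier(n_jobs=-1, max_depth=5, random_state=random_state)
        boruta = BorutaPy(rf, n_estimators='auto', random_state=random_state)
        boruta.fit(X[:, candidates], y)
        return candidates[boruta.support_]

In this sketch, redundancy is checked only against each group's representative rather than between all pairs of features, which is in line with the abstract's claim that grouping keeps the time complexity of the candidate-selection stage low.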
