Feature Selection Based on Correlation Information Entropy and Particle Swarm Optimization (Cited by: 1)
Abstract: To obtain a feature subset with high classification accuracy in feature selection, and considering how the correlations among features and between features and class labels affect a feature's ability to discriminate between classes, a feature selection algorithm is proposed that combines correlation-information-entropy-based feature ranking and elimination with a binary particle swarm optimization (BPSO) algorithm. The method first ranks features by importance using correlation information entropy and, with an SVM as the classifier, quickly removes some irrelevant features to give an initial reduction of the data dimensionality. An improved BPSO then continues the search for the optimal subset, with the good subsets produced by the ranking stage injected into part of the BPSO's initial population, so that the subsequent swarm search starts from a favorable point. Experimental results show that the feature subsets selected by this method achieve better classification performance and effectively reduce the system's detection time.
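The abstract describes a two-stage pipeline: entropy-based ranking first prunes clearly irrelevant features, then a BPSO whose initial swarm is partly seeded with the ranked subsets searches the reduced space, scoring candidate subsets with an SVM. The Python sketch below illustrates that pipeline under stated assumptions only: the paper's exact definition of correlation information entropy, its feature-class relevance term, and its PSO modifications are not given in this record, so an eigenvalue-based entropy, a leave-one-feature-out importance proxy, and a standard sigmoid-transfer BPSO update are assumed, and every function name here is hypothetical.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def correlation_information_entropy(X):
    # Entropy of the eigenvalue spectrum of the feature correlation matrix
    # (one common formulation; the paper's exact definition may differ).
    lam = np.clip(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)), 0.0, None)
    p = lam / lam.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(lam.size)

def rank_features(X):
    # Importance proxy: how much deleting feature j shifts the entropy of
    # the whole feature set; indices are returned most-important first.
    base = correlation_information_entropy(X)
    scores = np.array([abs(base - correlation_information_entropy(np.delete(X, j, axis=1)))
                       for j in range(X.shape[1])])
    return np.argsort(-scores)

def svm_fitness(mask, X, y):
    # Cross-validated SVM accuracy of the masked feature subset.
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

def bpso_select(X, y, seed_masks, n_particles=20, n_iter=30, seed=0):
    # Binary PSO over 0/1 feature masks; part of the initial swarm is
    # warm-started with the subsets produced by the ranking stage.
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    pos = rng.random((n_particles, n)) < 0.5
    pos[:len(seed_masks)] = seed_masks
    vel = rng.uniform(-1.0, 1.0, (n_particles, n))
    pbest = pos.copy()
    pfit = np.array([svm_fitness(p, X, y) for p in pos])
    gbest = pbest[pfit.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, n))
        vel = (0.7 * vel + 1.5 * r1 * (pbest.astype(float) - pos)
                         + 1.5 * r2 * (gbest.astype(float) - pos))
        # Sigmoid transfer: velocity sets the probability of each bit being 1.
        pos = rng.random((n_particles, n)) < 1.0 / (1.0 + np.exp(-vel))
        fit = np.array([svm_fitness(p, X, y) for p in pos])
        better = fit > pfit
        pbest[better], pfit[better] = pos[better], fit[better]
        gbest = pbest[pfit.argmax()].copy()
    return gbest

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(80, 12))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)      # synthetic data for the demo
    keep = rank_features(X)[: X.shape[1] // 2]   # stage 1: drop low-ranked features
    Xr, m = X[:, keep], X.shape[1] // 2
    seeds = np.zeros((3, m), dtype=bool)         # nested top-k masks as warm starts
    seeds[0, : m // 3] = True
    seeds[1, : 2 * m // 3] = True
    seeds[2, :] = True
    print("selected features:", keep[bpso_select(Xr, y, seeds)])

Seeding only part of the swarm is the key design point suggested by the abstract: the warm-start particles give the search a good starting point, while the remaining random particles keep exploring subsets the ranking stage may have missed.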
Authors: LIU Pei-yu, REN Min, CHEN Xiao-xue, WANG Yan-fei
Source: Information Technology and Informatization, 2018, No. 2, pp. 88-93 (6 pages)
Funding: National Natural Science Foundation of China (61373148, 61502151); Humanities and Social Sciences Foundation of the Ministry of Education (14YJC860042); Shandong Provincial Natural Science Foundation (ZR2014FL010); Shandong Province Award Fund for Outstanding Young and Middle-aged Scientists (BS2013DX033); Shandong Province Social Science Planning Project (16CFXJ05); Science and Technology Program of Shandong Higher Education Institutions (J15LN02, J15LN22)
Keywords: feature selection; correlation information entropy; binary particle swarm optimization; support vector machine

相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部