
Ensemble feature selection algorithm based on Markov blanket and mutual information

Cited by: 7
Abstract: Large numbers of irrelevant and redundant features can degrade classifier performance. To address this, a feature selection algorithm based on approximate Markov blankets and dynamic mutual information is proposed and applied to ensemble learning, yielding an ensemble feature selection algorithm. The ensemble algorithm combines Bagging with the proposed feature selection method to generate base classifiers, introduces a diversity measure among the base classifiers for selective ensemble, and finally fuses the recognition results of the selected base classifiers by weighted voting. The algorithm's effectiveness is verified by experiments on public UCI data sets with a support vector machine (SVM) as the classifier, compared against a single SVM, the classical Bagging ensemble (Bagging-SVM), and the Attribute Bagging ensemble (AB-SVM). The results show that the proposed method achieves higher classification accuracy.
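As a rough illustration of two building blocks named in the abstract, the following sketch ranks discrete features by mutual information with the class label and fuses base classifier outputs by weighted voting. It is a minimal, self-contained illustration only: the function names are hypothetical, and a plain top-k MI ranking stands in for the paper's approximate-Markov-blanket and dynamic-mutual-information procedure, which the abstract does not specify in detail.

```python
# Illustrative sketch, not the authors' exact algorithm: MI-based feature
# ranking plus weighted-vote fusion, assuming discrete-valued features.
import math
from collections import Counter

def mutual_information(feature, labels):
    """I(X;Y) in bits for two equal-length discrete sequences."""
    n = len(labels)
    px = Counter(feature)
    py = Counter(labels)
    pxy = Counter(zip(feature, labels))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return mi

def select_top_k(columns, labels, k):
    """Return indices of the k features with the highest MI score."""
    ranked = sorted(range(len(columns)),
                    key=lambda j: mutual_information(columns[j], labels),
                    reverse=True)
    return ranked[:k]

def weighted_vote(predictions, weights):
    """Fuse base classifier predictions by weighted voting."""
    tally = Counter()
    for pred, w in zip(predictions, weights):
        tally[pred] += w
    return tally.most_common(1)[0][0]
```

In the full method, each Bagging round would train a base classifier (an SVM in the paper) on its bootstrap sample restricted to the selected features, and only sufficiently diverse base classifiers would enter the weighted vote.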
Source: Systems Engineering and Electronics (《系统工程与电子技术》), EI / CSCD / Peking University Core journal, 2012, No. 5, pp. 1046-1050.
Funding: Supported by the National Natural Science Foundation of China (60975026).
Keywords: feature selection; ensemble; Markov blanket; mutual information


