
Discretization Based Feature Selection for Support Vector Machines (cited by 4)
Abstract: This paper presents a feature selection algorithm for support vector machines based on the rough sets and Boolean reasoning discretization approach proposed by Nguyen. The level of consistency, a notion taken from rough set theory, is introduced to control the information loss during discretization, so that irrelevant and redundant attributes are eliminated while the information the SVM needs for classification is preserved. Experimental results show that the proposed algorithm improves the prediction accuracy of the SVM classifier and reduces its training time.
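The idea summarized in the abstract can be sketched as follows. This is a minimal illustration of the rough-set level of consistency and a greedy attribute-elimination loop driven by it, not the authors' actual algorithm; the function names (`consistency`, `select_features`), the consistency definition used (majority class within each indiscernibility group), and the toy discretized table are all assumptions made for illustration.

```python
from collections import Counter, defaultdict

def consistency(rows, labels, attrs):
    """Level of consistency of a discretized table: the fraction of
    objects whose class is the majority class of their indiscernibility
    group (objects with identical values on `attrs`)."""
    groups = defaultdict(Counter)
    for row, y in zip(rows, labels):
        key = tuple(row[a] for a in attrs)
        groups[key][y] += 1
    return sum(max(c.values()) for c in groups.values()) / len(rows)

def select_features(rows, labels, attrs, threshold=1.0):
    """Greedy backward elimination: drop an attribute whenever the
    consistency of the reduced table stays at or above `threshold`,
    so only attributes carrying classification information remain."""
    kept = list(attrs)
    for a in list(kept):
        trial = [x for x in kept if x != a]
        if trial and consistency(rows, labels, trial) >= threshold:
            kept = trial
    return kept

# Toy discretized table: "a" determines the class, "c" duplicates it
# (c = 1 - a), and "b" is irrelevant noise.
rows = [{"a": 0, "b": 0, "c": 1},
        {"a": 0, "b": 1, "c": 1},
        {"a": 1, "b": 0, "c": 0},
        {"a": 1, "b": 1, "c": 0}]
labels = [0, 0, 1, 1]

print(select_features(rows, labels, ["a", "b", "c"]))
```

On this table the irrelevant attribute "b" and one of the two redundant attributes are dropped while consistency stays at 1.0, leaving a single attribute that still determines the class, which mirrors the paper's goal of deleting irrelevant and redundant attributes without losing classification information.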
Source: Computer Engineering (EI, CAS, CSCD, Peking University core journal), 2006, No. 11, pp. 16-17, 21 (3 pages).
Funding: National 973 Program of China (2002cb312200-01-1); National Natural Science Foundation of China (60274032).
Keywords: Discretization; Feature selection; Support vector machine; Classification; Consistency

References (9)

  • 1 Liu Huan, Setiono R. Feature Selection via Discretization[J]. IEEE Transactions on Knowledge and Data Engineering, 1997, 9(4).
  • 2 Nguyen S H, Skowron A. Quantization of Real Value Attributes[C]. Proc. of the 2nd Joint Annual Conference on Information Sciences, Wrightsville Beach, North Carolina, USA, 1995.
  • 3 Pawlak Z. Rough Sets: Theoretical Aspects of Reasoning About Data[M]. Dordrecht: Kluwer Academic Publishers, 1991.
  • 4 Nguyen S H. Some Efficient Algorithms for Rough Set Methods[C]. Proc. of the Conference on Information Processing and Management of Uncertainty in Knowledge-based Systems, Granada, Spain, 1996: 1451-1456.
  • 5 Hettich S, Bay S D. The UCI KDD Archive[DB/OL]. http://kdd.ics.uci.edu/, 1999.
  • 6 King R D. Statlog Databases[DB/OL]. Department of Statistics and Modelling Science. http://www.liacc.up.pt/ML/statlog/datasets.html, 1992.
  • 7 Chang Chih-Chung, Lin Chih-Jen. LIBSVM: A Library for Support Vector Machines[CP/OL]. http://www.csie.ntu.edu.tw/~cjlin/libsvm, 2001.
  • 8 Boser B, Guyon I, Vapnik V. A Training Algorithm for Optimal Margin Classifiers[C]. Proc. of the 5th Annual Workshop on Computational Learning Theory, 1992.
  • 9 Ventura D, Martinez T R. An Empirical Comparison of Discretization Methods[C]. Proc. of the 10th International Symposium on Computer and Information Sciences, 1995: 443-450.


