
COMBINING FEATURE SCALING ESTIMATION WITH SVM CLASSIFIER DESIGN USING GA APPROACH (Cited by: 2)

Abstract: This letter adopts a GA (Genetic Algorithm) approach to learn the feature scalings that are most favorable to an SVM (Support Vector Machines) classifier; the resulting method is named GA-SVM. The relevance coefficients of the various features to the classification task, measured by real-valued scalings, are estimated efficiently by the GA, which exploits a heavy-bias operator to promote sparsity in the feature scalings. This method offers several benefits: feature selection is performed by eliminating irrelevant features whose scaling is zero, and an SVM classifier with enhanced generalization ability is learned simultaneously. Experimental comparisons between the original SVM and GA-SVM demonstrate both economical feature selection and excellent classification accuracy on a junk e-mail recognition problem and an Internet ad recognition problem. The results show that, compared with the original SVM classifier, GA-SVM significantly decreases the number of support vectors and achieves better classification results. They also demonstrate that the GA provides a simple, general, and powerful framework for tuning parameters in an optimization problem, which directly improves the recognition performance and recognition rate of the SVM.
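The approach described in the abstract can be illustrated with a minimal sketch: a genetic algorithm evolves a real-valued scaling vector, each candidate is scored by the cross-validated accuracy of an SVM trained on the scaled features, and a sparsity-promoting mutation snaps small scalings to zero so that irrelevant features are eliminated. This is not the authors' implementation; the GA operators, population sizes, and the `zero_bias` threshold below are illustrative assumptions, using scikit-learn's `SVC` in place of the paper's SVM.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fitness(scaling, X, y):
    """Cross-validated accuracy of an SVM trained on scaled features."""
    return cross_val_score(SVC(kernel="rbf"), X * scaling, y, cv=3).mean()

def evolve_scalings(X, y, pop_size=10, generations=15, zero_bias=0.3):
    """Evolve a feature-scaling vector in [0, 1]^n maximizing SVM accuracy."""
    n = X.shape[1]
    pop = rng.random((pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # elitism
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(n) < 0.5              # uniform crossover
            child = np.clip(np.where(mask, a, b)
                            + rng.normal(0, 0.1, n), 0.0, 1.0)  # mutation
            # sparsity-promoting step: snap small scalings to exactly zero,
            # so the corresponding features are effectively deselected
            child[child < zero_bias * rng.random(n)] = 0.0
            children.append(child)
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()]

# Toy problem: 6 features, only 3 informative, to exercise feature selection.
X, y = make_classification(n_samples=80, n_features=6, n_informative=3,
                           n_redundant=0, random_state=0)
best = evolve_scalings(X, y)
```

Features whose entry in `best` is zero are discarded, while the surviving scalings tune the effective kernel width per feature, which is the dual benefit the abstract claims.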
Source: Journal of Electronics (China), 2005, Issue 5, pp. 550-557 (8 pages)
Funding: Supported by the National Natural Science Foundation of China (No. 60175020) and the National High Tech Development '863' Program of China (No. 2002AA117010-09).
Keywords: Support Vector Machines (SVM); Genetic Algorithm (GA); Feature scaling; Feature selection; Zero-bias operator