
SVM Model Selection Based on Genetic Algorithms and Empirical Error Minimization

Cited by: 1
Abstract  The generalization ability of a support vector machine (SVM) depends on the choice of kernel function, kernel parameters, and penalty factor, i.e., on model selection. After analyzing how these parameters affect the classifier's recognition accuracy, we propose a new SVM model selection method based on a genetic algorithm and empirical error minimization. Experiments on 13 UCI benchmark data sets demonstrate the correctness and effectiveness of the proposed algorithm, as well as its good generalization performance.
Authors  周欣 (Zhou Xin), 许建华 (Xu Jianhua)
Source  Journal of Nanjing Normal University (Engineering and Technology Edition) (《南京师范大学学报(工程技术版)》), CAS, 2009, No. 2, pp. 65-71 (7 pages)
Funding  Supported by the National Natural Science Foundation of China (60875001)
Keywords  support vector machine, kernel function, kernel parameter, empirical error, genetic algorithm
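
The abstract above only outlines the procedure. The following is a minimal illustrative sketch, not the authors' implementation: a simple genetic algorithm evolves (log2 C, log2 gamma) for an RBF-kernel SVM, with 5-fold cross-validation error standing in for the paper's empirical error criterion. The scikit-learn SVC solver, the particular GA operators (tournament selection, arithmetic crossover, Gaussian mutation, elitism), and the UCI-derived breast cancer benchmark are all assumptions made for this sketch, not details taken from the paper.

# Sketch: GA-based SVM model selection minimizing an empirical error estimate.
# Individuals encode (log2 C, log2 gamma); fitness is 5-fold CV error (lower is better).
import random

import numpy as np
from sklearn.datasets import load_breast_cancer   # UCI-derived benchmark, used here as a stand-in
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

BOUNDS = [(-5.0, 15.0), (-15.0, 3.0)]             # search ranges for log2(C), log2(gamma)
POP_SIZE, GENERATIONS, MUT_STD, ELITE = 20, 15, 0.5, 2

def empirical_error(ind):
    """Cross-validation error of an RBF-SVM with the decoded hyperparameters."""
    C, gamma = 2.0 ** ind[0], 2.0 ** ind[1]
    acc = cross_val_score(SVC(C=C, gamma=gamma, kernel="rbf"), X, y, cv=5).mean()
    return 1.0 - acc

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    # Arithmetic crossover: child is a random convex combination of the parents.
    w = random.random()
    return [w * ga + (1 - w) * gb for ga, gb in zip(a, b)]

def mutate(ind):
    # Gaussian mutation, clipped back into the search box.
    return [min(max(g + random.gauss(0.0, MUT_STD), lo), hi)
            for g, (lo, hi) in zip(ind, BOUNDS)]

def tournament(pop, errs, k=3):
    # Pick k random individuals and return the one with the lowest error.
    picks = random.sample(range(len(pop)), k)
    return pop[min(picks, key=lambda i: errs[i])]

population = [random_individual() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    errors = [empirical_error(ind) for ind in population]
    ranked = [ind for _, ind in sorted(zip(errors, population))]
    next_pop = ranked[:ELITE]                     # elitism: carry over the best individuals
    while len(next_pop) < POP_SIZE:
        child = mutate(crossover(tournament(population, errors),
                                 tournament(population, errors)))
        next_pop.append(child)
    population = next_pop
    print(f"gen {gen:2d}  best CV error = {min(errors):.4f}")

best = min(population, key=empirical_error)
print("selected C = %.4g, gamma = %.4g" % (2.0 ** best[0], 2.0 ** best[1]))

In practice one would repeat this over each benchmark data set and compare the selected (C, gamma) against a grid search; the population size, generation count, and mutation scale above are small arbitrary values chosen only to keep the sketch quick to run.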
Related Literature

References (13)

  • 1. Rätsch G, Onoda T, Müller K R. Soft margins for AdaBoost[J]. Machine Learning, 2001, 42: 287-320.
  • 2. Chapelle O, Vapnik V, Bousquet O, et al. Choosing multiple parameters for support vector machines[J]. Machine Learning, 2002, 46: 131-159.
  • 3. Keerthi S S. Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms[J]. IEEE Transactions on Neural Networks, 2002, 13: 1225-1229.
  • 4. Duan K, Keerthi S S, Poo A N. Evaluation of simple performance measures for tuning SVM hyperparameters[J]. Neurocomputing, 2003, 51: 41-59.
  • 5. Ayat N E, Cheriet M, Suen C Y. Optimization of the SVM kernels using an empirical error minimization scheme[C]//Lee S W, Verri A. Pattern Recognition with Support Vector Machines. Berlin, Heidelberg: Springer, 2002, 2388: 354-369.
  • 6. Adankon M M, Cheriet M, Ayat N E. Optimizing resources in model selection for support vector machines[C]//2005 International Joint Conference on Neural Networks. Montreal, Canada, 2005: 925-930.
  • 7. Ayat N E, Cheriet M, Suen C Y. Automatic model selection for the optimization of the SVM kernels[J]. Pattern Recognition, 2005, 38: 1733-1745.
  • 8. Adankon M M, Cheriet M. New formulation of SVM for model selection[C]//2006 International Joint Conference on Neural Networks. Vancouver, Canada: IEEE Press, 2006: 1900-1907.
  • 9. Zheng C H, Li C J. Automatic parameters selection for SVM based on GA[C]//5th World Congress on Intelligent Control and Automation. Hangzhou, China: IEEE Press, 2004: 1869-1872.
  • 10. Javier A, Saturnino M, Philip S. Tuning L1-SVM hyper-parameters with modified radius margin bounds and simulated annealing[C]//Computational and Ambient Intelligence. Berlin, Heidelberg: Springer-Verlag, 2007, 4507: 284-291.

Co-cited Documents: 12

Citing Documents: 1

Secondary Citing Documents: 3
