
New approach for optimizing model of RBF-SVM based on PSO
(一种基于PSO的RBF-SVM模型优化新方法; cited 15 times)
Abstract: For support vector machines (SVM) with the radial basis function (RBF) kernel, particle swarm optimization (PSO) is employed to carry out model optimization. The value space of the parameter σ is derived from the mean nearest and mean farthest distances among the samples of the training set, which narrows the hyper-parameter search region; a logarithmic scale is then used to further improve the parameter-search efficiency of PSO. Comparative experiments with a genetic algorithm and a grid-based approach show that the proposed method converges faster and produces better hyper-parameters.
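The σ-range heuristic and log-scale PSO search described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the use of scikit-learn's SVC, the 5-fold CV fitness, and the PSO coefficients (w = 0.7, c1 = c2 = 1.5) are assumptions for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=120, n_features=4, random_state=0)

# Pairwise Euclidean distances among training samples.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
np.fill_diagonal(D, np.inf)
sigma_min = np.mean(D.min(axis=1))   # mean nearest-neighbour distance
np.fill_diagonal(D, -np.inf)
sigma_max = np.mean(D.max(axis=1))   # mean farthest-neighbour distance

def fitness(log_sigma):
    """5-fold CV accuracy of an RBF-SVM for a given log10(sigma)."""
    sigma = 10.0 ** log_sigma
    gamma = 1.0 / (2.0 * sigma ** 2)  # RBF: exp(-||x - z||^2 / (2 sigma^2))
    return cross_val_score(SVC(kernel="rbf", gamma=gamma), X, y, cv=5).mean()

# Basic PSO over log10(sigma), restricted to [log10(sigma_min), log10(sigma_max)].
lo, hi = np.log10(sigma_min), np.log10(sigma_max)
n, iters, w, c1, c2 = 10, 20, 0.7, 1.5, 1.5
pos = rng.uniform(lo, hi, n)
vel = np.zeros(n)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()]
for _ in range(iters):
    vel = (w * vel + c1 * rng.random(n) * (pbest - pos)
           + c2 * rng.random(n) * (gbest - pos))
    pos = np.clip(pos + vel, lo, hi)   # stay inside the derived sigma range
    fit = np.array([fitness(p) for p in pos])
    better = fit > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()]

best_sigma = 10.0 ** gbest
print(f"sigma range: [{sigma_min:.3f}, {sigma_max:.3f}], best sigma: {best_sigma:.3f}")
```

Searching in log10(σ) rather than σ is what the abstract calls the logarithmic scale: it spreads the particles evenly across orders of magnitude instead of clustering them near the upper bound.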
Source: Control and Decision (《控制与决策》; EI, CSCD, Peking University core journal), 2010, No. 3, pp. 367-370, 377 (5 pages).
Funding: National Natural Science Foundation of China (60975026); Shaanxi Provincial Natural Science Research Program (2007F19).
Keywords: model optimization; support vector machine; particle swarm optimization; search efficiency

References (10)

  • 1 Dong Y L, Xia Z H, Xia Z Q. A two-level approach to choose the cost parameter in support vector machines [J]. Expert Systems with Applications, 2008, 34(2): 1366-1370.
  • 2 Ayat N E, Cheriet M, Suen C Y. Automatic model selection for the optimization of SVM kernels [J]. Pattern Recognition, 2005, 38(10): 1733-1745.
  • 3 Liu Xiangdong, Luo Bin, Chen Zhaoqian. Optimal model selection of support vector machines [J]. Journal of Computer Research and Development, 2005, 42(4): 576-581.
  • 4 Zhu Jiayuan, Yang Yun, Zhang Hengxi, Ren Bo. Multi-layer dynamic adaptive parameter optimization for support vector machines [J]. Control and Decision, 2004, 19(2): 223-225.
  • 5 Zheng Chunhong, Jiao Licheng, Ding Ailing. Automatic model selection of SVM based on a heuristic genetic algorithm [J]. Control Theory & Applications, 2006, 23(2): 187-192.
  • 6 Avci E. Selecting of the optimal feature subset and kernel parameters in digital modulation classification by using hybrid genetic algorithm-support vector machines: HGASVM [J]. Expert Systems with Applications, 2009, 36(2): 1391-1402.
  • 7 Keerthi S S. Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms [J]. IEEE Trans on Neural Networks, 2002, 13(5): 1225-1229.
  • 8 Kennedy J, Eberhart R C. Particle swarm optimization [C]. Proc of IEEE Int Conf on Neural Networks. Perth, 1995, 4: 1942-1948.
  • 9 Qiao Liyan, Peng Xiyuan, Peng Yu. Feature subset selection method based on particle swarm optimization and support vector machines [J]. Acta Electronica Sinica, 2006, 34(3): 496-498.
  • 10 Clerc M, Kennedy J. The particle swarm explosion, stability, and convergence in a multidimensional complex space [J]. IEEE Trans on Evolutionary Computation, 2002, 6(1): 58-73.

