
Study on the Optimization of SVR Parameter Selection Based on an Improved Particle Swarm Optimization Algorithm (Cited by: 5)

To Study on Optimization of SVR Parameters Selection Based on IPSO
Abstract: The fitting accuracy and generalization ability of a support vector regression (SVR) model depend on the proper selection of its parameters. Because the number of candidate values within the parameter ranges is infinite, a blind search for the optimal combination of several parameters carries an enormous time cost and rarely comes close to the optimum. An optimal selection approach for SVR parameters based on an improved particle swarm optimization (IPSO) algorithm is therefore proposed. Simulation results show that the IPSO-based parameter optimization method is feasible and effective, and that the resulting IPSO-SVR model has better learning accuracy and generalization ability.
Authors: 张贺, 贺兴时
Source: Value Engineering (《价值工程》), 2008, No. 11, pp. 90-93 (4 pages)
Keywords: improved particle swarm optimization algorithm; support vector regression (SVR); decreasing strategy
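
The abstract and keywords describe the method only at a high level: an improved particle swarm optimizer with a decreasing strategy searches for the SVR parameters that minimize a validation error. The sketch below is a minimal, illustrative reading of that idea, not the authors' implementation: it assumes the "decreasing strategy" is a linearly decreasing inertia weight (a common IPSO variant), tunes the C, gamma, and epsilon parameters of scikit-learn's SVR on a toy data set, and uses assumed bounds, swarm size, and acceleration constants throughout.

```python
# Minimal sketch (not the paper's code): PSO over SVR hyper-parameters
# (C, gamma, epsilon) with a linearly decreasing inertia weight, one common
# reading of the paper's "decreasing strategy".  Fitness is 5-fold
# cross-validated MSE on a toy sinc data set; all bounds and PSO constants
# below are illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy regression data: noisy sinc function.
X = np.linspace(-5, 5, 200).reshape(-1, 1)
y = np.sinc(X).ravel() + rng.normal(scale=0.1, size=X.shape[0])

# Search space in log10 scale for [C, gamma, epsilon] (assumed bounds).
LOW = np.array([-1.0, -3.0, -3.0])
HIGH = np.array([3.0, 1.0, 0.0])

def fitness(pos):
    """Cross-validated MSE of an SVR built from one particle position."""
    C, gamma, eps = 10.0 ** pos
    model = SVR(C=C, gamma=gamma, epsilon=eps)
    return -cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()

n_particles, n_iter, dim = 20, 30, 3
w_max, w_min, c1, c2 = 0.9, 0.4, 2.0, 2.0      # assumed PSO constants

pos = rng.uniform(LOW, HIGH, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest_pos = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest_pos = pbest_pos[pbest_val.argmin()].copy()
gbest_val = pbest_val.min()

for t in range(n_iter):
    # Linearly decreasing inertia weight: large w early (exploration),
    # small w late (exploitation).
    w = w_max - (w_max - w_min) * t / (n_iter - 1)
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    vel = (w * vel
           + c1 * r1 * (pbest_pos - pos)
           + c2 * r2 * (gbest_pos - pos))
    pos = np.clip(pos + vel, LOW, HIGH)

    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest_pos[improved], pbest_val[improved] = pos[improved], vals[improved]
    if vals.min() < gbest_val:
        gbest_val = vals.min()
        gbest_pos = pos[vals.argmin()].copy()

C, gamma, eps = 10.0 ** gbest_pos
print(f"best CV MSE {gbest_val:.4f} at C={C:.3g}, gamma={gamma:.3g}, epsilon={eps:.3g}")
```

Searching in log10 space keeps the particle steps comparable across parameters whose useful ranges span several orders of magnitude; substituting the paper's actual data, bounds, and improved update rule would only change the fitness function and the velocity update.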

References (8)

  • 1 Vapnik V. An overview of statistical learning theory [J]. IEEE Trans on Neural Networks, 1999, 10(5): 988-999.
  • 2 许建华, 张学工, 李衍达. New developments in support vector machines [J]. 控制与决策 (Control and Decision), 2004, 19(5): 481-484. (Cited by: 132)
  • 3 Burges C J C. A tutorial on support vector machines for pattern recognition [J]. Data Mining and Knowledge Discovery, 1998, 2(2): 121-167.
  • 4 Reyna R A, Hernandez N, Esteve D, et al. Segmenting images with support vector machines [C]. Proceedings of the 2000 International Conference on Image Processing, Vancouver, BC, Canada, 2000, 1(1): 820-823.
  • 5 Valentini A, Zhang Hongjiang. Automatic image orientation detection [J]. IEEE Trans on Image Processing, 2002, 11(7): 746-755.
  • 6 Cherkassky V, Ma Yunqian. Practical selection of SVM parameters and noise estimation for SVM regression [J]. Neural Networks, 2004, 17(1): 113-126.
  • 7 Vapnik V. Statistical learning theory [M]. New York: Wiley, 1998.
  • 8 Üstün B, Melssen W J. Determination of optimal support vector regression parameters by genetic algorithms and simplex optimization [J]. Analytica Chimica Acta, 2005, 544(1-2): 292-305.
