
采用双目标优化的核参数选择方法 (Cited by: 1)

Bi-criterion optimization for kernel parameters selection
Abstract: Selecting the parameters of the kernel function is an important problem in nonlinear learning with support vector machines (SVMs). Because these parameters strongly affect the performance of the nonlinear SVM, a bi-criterion method based on a distance criterion and an angle criterion is given for choosing the optimal parameters of a given kernel function. The distance criterion chooses the kernel parameters so that the sum of squared distances from the training samples of each class to their own class center is as small as possible, while the sum of squared distances to the center of the opposite class is as large as possible. The angle criterion makes the angle between the kernel matrix and the target kernel matrix as small as possible. Experimental results show that the method is effective.
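As a concrete illustration of the two criteria, the following is a minimal NumPy sketch, not the authors' code. It assumes an RBF kernel; the synthetic data, the grid of gamma values, and the ratio used to scalarize the distance criterion are illustrative assumptions. Both class-center distances are computed entirely through the Gram matrix via the kernel trick, and the angle criterion is evaluated as the kernel-target alignment, i.e. the cosine of the angle between the kernel matrix K and the target matrix yy^T.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gram matrix of the RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def sum_sq_dist_to_center(K, rows, cols):
    """Sum over i in `rows` of ||phi(x_i) - m||^2, where m is the feature-space
    centroid of the samples in `cols`, using only kernel evaluations:
    ||phi(x_i) - m||^2 = K_ii - (2/n) sum_j K_ij + (1/n^2) sum_{j,k} K_jk."""
    n = len(cols)
    return (K[rows, rows]
            - 2.0 / n * K[np.ix_(rows, cols)].sum(axis=1)
            + K[np.ix_(cols, cols)].sum() / n ** 2).sum()

def alignment(K, y):
    """Kernel-target alignment: cosine of the angle between K and y y^T.
    Larger alignment means a smaller angle (the angle criterion)."""
    yyT = np.outer(y, y)
    return np.sum(K * yyT) / (np.linalg.norm(K) * np.linalg.norm(yyT))

# Two Gaussian blobs as a stand-in for labeled training data (assumption).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (30, 2)), rng.normal(3.0, 1.0, (30, 2))])
y = np.array([1] * 30 + [-1] * 30)
pos, neg = np.where(y == 1)[0], np.where(y == -1)[0]

for gamma in (0.01, 0.1, 1.0, 10.0):
    K = rbf_kernel(X, gamma)
    within = (sum_sq_dist_to_center(K, pos, pos)
              + sum_sq_dist_to_center(K, neg, neg))
    between = (sum_sq_dist_to_center(K, pos, neg)
               + sum_sq_dist_to_center(K, neg, pos))
    # Distance criterion: small within-class / large between-class scatter.
    # The ratio below is one simple scalarization, not the paper's exact form.
    print(f"gamma={gamma:5.2f}  within/between={within / between:.3f}  "
          f"alignment={alignment(K, y):.3f}")
```

A parameter value that simultaneously drives the within/between ratio down and the alignment up is a candidate solution of the bi-criterion problem; how the two objectives are traded off is left to whatever scalarization or Pareto scheme one chooses.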
Source: Electronics Optics & Control (《电光与控制》, a journal on the Peking University core-journal list), 2007, No. 6, pp. 197-200, 201 (5 pages in total)
Funding: National Natural Science Foundation of China (60603098)
Keywords: support vector machine; kernel function; parameter selection; bi-criterion optimization



