
Max-Min Method for Kernel-Parameter Selection in SVM (Cited by: 2)
Abstract: Using the kernel-function technique, a distance is defined between two sample points in the high-dimensional feature space. A between-class distance-square matrix is introduced, and a new criterion and algorithm for selecting the kernel parameter in support vector machines (SVM), the max-min method, is proposed. The method uses only the distances between training samples of different classes and does not require training a standard SVM to search for an optimal (or effective) kernel parameter, thereby avoiding the heavy reliance on experience and the large computational cost of conventional SVM model selection. Experiments with the radial basis function (RBF) kernel and the polynomial kernel illustrate the steps of the algorithm. Combining the experimental results, we conclude that kernel-parameter selection is a multi-objective optimization problem: in general there is no single optimal value, only a set of effective values within an open set. Published experimental results are cited to support this conclusion. The max-min method not only provides a theoretical way of selecting the kernel parameter but also gives guidance for its experimental selection.
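The idea of the criterion can be sketched numerically. The code below is a minimal, hypothetical reading of the max-min idea, assuming an RBF kernel k(x, z) = exp(-gamma * ||x - z||^2), the kernel-induced squared distance d^2(x, z) = k(x, x) - 2 k(x, z) + k(z, z), and a score for each candidate gamma equal to the smallest entry of the between-class distance-square matrix; the function names (rbf_kernel, between_class_dist_sq, max_min_scores) and the candidate values are illustrative and are not taken from the paper.

    import numpy as np

    def rbf_kernel(X, Z, gamma):
        # k(x, z) = exp(-gamma * ||x - z||^2)
        sq = np.sum(X**2, axis=1)[:, None] + np.sum(Z**2, axis=1)[None, :] - 2.0 * X @ Z.T
        return np.exp(-gamma * sq)

    def between_class_dist_sq(X_pos, X_neg, gamma):
        # Kernel-induced squared distance in feature space:
        # d^2(x, z) = k(x, x) - 2 k(x, z) + k(z, z)
        Kxx = np.diag(rbf_kernel(X_pos, X_pos, gamma))   # k(x_i, x_i); all ones for the RBF kernel
        Kzz = np.diag(rbf_kernel(X_neg, X_neg, gamma))   # k(z_j, z_j)
        Kxz = rbf_kernel(X_pos, X_neg, gamma)            # k(x_i, z_j)
        return Kxx[:, None] - 2.0 * Kxz + Kzz[None, :]

    def max_min_scores(X_pos, X_neg, gammas):
        # Score each candidate gamma by the smallest between-class distance;
        # inspecting the scores over a range of candidates is meant to reveal
        # an "effective" region of parameters rather than a single optimum.
        return [between_class_dist_sq(X_pos, X_neg, g).min() for g in gammas]

    # Toy usage on synthetic two-class data (illustration only).
    rng = np.random.default_rng(0)
    X_pos = rng.normal(loc=+1.0, size=(20, 2))
    X_neg = rng.normal(loc=-1.0, size=(20, 2))
    gammas = [0.01, 0.1, 1.0, 10.0]
    for g, s in zip(gammas, max_min_scores(X_pos, X_neg, gammas)):
        print(f"gamma={g}: min between-class distance^2 = {s:.4f}")

Consistent with the paper's conclusion, the score curve is examined over a range of candidate parameters to identify an effective region instead of a single best value.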
Source: Henan Science (《河南科学》), 2007, No. 3, pp. 469-472 (4 pages).
Funding: National Natural Science Foundation of China (60574075); Natural Science Research Project of the Education Department of Henan Province (2004601013).
Keywords: support vector machines (SVM); kernel function; kernel parameter; between-class distance-square matrix

References (8)

1. VAPNIK V N. The nature of statistical learning theory [M]. New York: Springer-Verlag, 1995.
2. CRISTIANINI N, SHAWE-TAYLOR J. An introduction to support vector machines and other kernel-based learning methods [M]. Cambridge: Cambridge University Press, 2000.
3. CHAPELLE O, VAPNIK V N. Model selection for support vector machines [C] // Proc. of the 12th Conf. on Neural Information Processing Systems. Cambridge, MA: MIT Press, 1999.
4. CHAPELLE O, VAPNIK V N, BOUSQUET O, et al. Choosing multiple parameters for support vector machines [J]. Machine Learning, 2002, 46(1): 131-159.
5. CRISTIANINI N, SHAWE-TAYLOR J, KANDOLA J, et al. On kernel target alignment [C] // Proc. of Neural Information Processing Systems. Cambridge, MA: MIT Press, 2002: 367-373.
6. TSUDA K, RATSCH G, MIKA S, et al. Learning to predict the leave-one-out error of kernel based classifiers [C] // Proc. of the 2001 Int'l Conf. on Artificial Neural Networks (ICANN 2001). Berlin: Springer-Verlag, 2001.
7. LIU Xiangdong, LUO Bin, CHEN Zhaoqian. Research on optimal model selection for support vector machines [J]. Journal of Computer Research and Development, 2005, 42(4): 576-581.
8. BLAKE C L, MERZ C J. UCI repository of machine learning databases [DB/OL]. [2006-05-21]. http://www.ics.uci.edu/~mlearn/MLRepository.html.

