
Novel linear search for support vector machine parameter selection (cited by: 2)

Abstract: Selecting the optimal parameters for a support vector machine (SVM) has long been a hot research topic. Aiming at support vector classification/regression (SVC/SVR) with the radial basis function (RBF) kernel, we summarize a rough line rule for the penalty parameter and the kernel width, and propose a novel linear search method to obtain these two optimal parameters. We use a direct-setting method with thresholds to set the epsilon parameter of SVR. The proposed method directly locates the right search field, which greatly saves computing time and achieves stable, high accuracy. The method is competitive for both SVC and SVR, and it is easy to use and applicable to a new data set without any adjustments, since it requires no parameters to be set.
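
To make the idea of a one-dimensional parameter search concrete, the sketch below scores candidate (C, gamma) pairs that lie along a single line in log2 space with 5-fold cross-validation, using scikit-learn's SVC on the iris data set. This is only an illustrative assumption, not the paper's algorithm: the particular search line (log2 gamma = -log2 C - 3), the grid range, and the data set are choices made for demonstration.

# Illustrative sketch only: a 1-D ("linear") search over candidate
# (C, gamma) pairs for an RBF-kernel SVC, scored by cross-validation.
# NOT the paper's algorithm; the search line, grid range, and data set
# below are assumptions made for demonstration.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Candidate points along one line in (log2 C, log2 gamma) space,
# instead of a full two-dimensional grid.
log2_c = np.linspace(-5, 15, 21)
log2_g = -log2_c - 3          # assumed coupling between C and gamma

best_score, best_params = -np.inf, None
for lc, lg in zip(log2_c, log2_g):
    clf = SVC(kernel="rbf", C=2.0 ** lc, gamma=2.0 ** lg)
    score = cross_val_score(clf, X, y, cv=5).mean()
    if score > best_score:
        best_score, best_params = score, (2.0 ** lc, 2.0 ** lg)

print("best (C, gamma):", best_params, "CV accuracy:", round(best_score, 3))

Restricting the candidates to one line rather than sweeping a full grid is what keeps the number of trained models linear in the number of candidate values, which is the computational advantage the abstract refers to.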
Source: Journal of Zhejiang University-Science C (Computers and Electronics), indexed in SCIE and EI, 2011, Issue 11, pp. 885-896 (12 pages).
Funding: supported by the National Basic Research Program (973) of China (No. 2009CB724006) and the National Natural Science Foundation of China (No. 60977010).


