Journal Article

Study on the Fast Training Algorithm of Iteratively Re-weighted Least Squares Support Vector Machine (cited by 7)
Abstract: The iteratively reweighted method is an important approach for improving the robustness of the least squares support vector machine (LS-SVM). However, because it involves repeated reweighting and retraining, the method demands a great deal of computation, which has prevented its wide practical use. Through numerical derivation, this paper obtains a fast training algorithm for the iteratively reweighted least squares support vector machine (IRLS-SVM) that greatly reduces its computational complexity. Three classical weight functions are implemented in the IRLS-SVM, and experiments on several simulated and real-world datasets confirm that IRLS-SVM achieves highly robust learning results and that the proposed fast algorithm substantially reduces training time. The results also show that, within the fast training framework, the three weight functions may require different amounts of training time.
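The abstract describes the general IRLS-SVM scheme: solve the LS-SVM linear system, compute residuals, downweight samples with large residuals, and retrain. The paper's specific weight functions and fast solver are not reproduced here, so the following is a minimal illustrative sketch of the basic (unaccelerated) iteration in Python, assuming an RBF kernel and a Huber-type weight function; both are common choices for robust LS-SVM but are assumptions, not necessarily the paper's exact setup.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=0.2):
    # Gaussian RBF kernel matrix between row-sample matrices X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_solve(K, y, gamma, v):
    # Solve the (weighted) LS-SVM dual linear system:
    #   [ 0   1^T                 ] [ b     ]   [ 0 ]
    #   [ 1   K + diag(1/(g*v_i)) ] [ alpha ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def irls_svm(X, y, gamma=10.0, sigma=0.2, n_iter=5, c=1.345):
    # Iteratively reweighted LS-SVM regression (naive version: one full
    # solve per iteration; the paper's contribution is avoiding this cost).
    K = rbf_kernel(X, X, sigma)
    v = np.ones(len(y))  # start unweighted
    for _ in range(n_iter):
        b, alpha = lssvm_solve(K, y, gamma, v)
        e = alpha / (gamma * v)  # LS-SVM residuals: e_i = alpha_i / (gamma * v_i)
        s = np.median(np.abs(e - np.median(e))) / 0.6745 + 1e-12  # robust MAD scale
        r = np.abs(e) / s
        v = np.where(r <= c, 1.0, c / r)  # Huber-type weight function (assumed)
    b, alpha = lssvm_solve(K, y, gamma, v)
    return alpha, b, v
```

Predictions follow as `rbf_kernel(X_new, X, sigma) @ alpha + b`; after a few iterations, outlying samples end up with weights `v_i` well below 1, so they no longer dominate the squared-error fit. Note that each iteration here re-solves an (n+1)-by-(n+1) system from scratch, which is exactly the O(n^3)-per-iteration cost the paper's fast algorithm is designed to avoid.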
Source: Computer Science (《计算机科学》), CSCD, Peking University core journal, 2010, No. 8, pp. 224-228, 297 (6 pages)
Funding: Supported by the Open Project Foundation of the State Key Laboratory of Information Security (20090401)
Keywords: support vector machines, robustness, outliers, fast algorithm
