
加权稳健支撑向量回归方法 (Cited by: 13)

Reweighted Robust Support Vector Regression Method
Abstract  A reweighted robust support vector regression method (WRSVR), based on soft pruning of outliers, is presented for regression estimation and function approximation. The essential idea is to reweight the SVR objective iteratively so that outliers are softly pruned. First, an approximate regression function is obtained with standard support vector regression (SVR). Based on this approximation, a weighted SVR objective function is constructed and solved with efficient SVR techniques, yielding a new approximate model; this new model is then used to build another weighted SVR objective whose solution gives a more accurate approximation, and the procedure is repeated until convergence. The method is conceptually simple, highly robust, and easy to implement. Experiments show that WRSVR is more robust than standard SVR, the robust support vector regression network (RSVR), and the weighted least squares support vector machine (WLSSVM), and that its approximation accuracy is far less affected by outliers than that of these methods.
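The abstract above describes the reweighting loop but not its exact weight function, stopping rule, or solver details. The Python sketch below is only a hypothetical illustration of the general scheme (fit, compute residuals, downweight points with large residuals, refit): the MAD-based Huber-style weights, the cutoff constant c, and the helper name reweighted_svr are assumptions of this sketch, not the authors' WRSVR formulation, and scikit-learn's SVR (with its sample_weight argument) stands in for the paper's efficient SMO-based solver.

# Hypothetical sketch of iteratively reweighted SVR for soft outlier pruning.
import numpy as np
from sklearn.svm import SVR

def reweighted_svr(X, y, n_iter=10, c=2.5, tol=1e-4, **svr_kwargs):
    # Iteratively refit an SVR, downweighting points with large residuals.
    # The Huber-style weights below are an assumed choice, not the paper's.
    weights = np.ones(len(y))
    model = SVR(**svr_kwargs).fit(X, y, sample_weight=weights)
    for _ in range(n_iter):
        residuals = y - model.predict(X)
        # Robust scale estimate via the median absolute deviation (MAD).
        scale = 1.4826 * np.median(np.abs(residuals - np.median(residuals))) + 1e-12
        r = np.abs(residuals) / scale
        # Soft pruning: weight 1 for small residuals, shrinking toward 0 for outliers.
        new_weights = np.where(r <= c, 1.0, c / r)
        if np.max(np.abs(new_weights - weights)) < tol:
            break
        weights = new_weights
        model = SVR(**svr_kwargs).fit(X, y, sample_weight=weights)
    return model, weights

# Toy usage: a noisy sinc curve with a few injected gross outliers.
rng = np.random.default_rng(0)
X = np.linspace(-4, 4, 200).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(200)
y[rng.choice(200, size=10, replace=False)] += 3.0
model, w = reweighted_svr(X, y, kernel="rbf", C=10.0, epsilon=0.01)
print("smallest weights (likely outliers):", np.sort(w)[:5])

Points that receive the smallest weights are the ones the loop has softly pruned; increasing c makes the pruning gentler, decreasing it more aggressive.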
Authors  张讲社, 郭高
Source  Chinese Journal of Computers (《计算机学报》; indexed in EI and CSCD; Peking University Core Journal), 2005, No. 7, pp. 1171-1177 (7 pages)
Funding  Supported by the National Natural Science Foundation of China (60373106) and the National High-Tech Research and Development Program of China (863 Program) (2001AA113182)
Keywords  support vector machine; robust support vector regression; outliers; soft pruning; statistical learning. EI classification terms: Computer simulation; Least squares approximations; Numerical analysis; Robustness (control systems); Statistical methods; Vector quantization

References (7)

  • 1 Vapnik V.N.. The Nature of Statistical Learning Theory (统计学习理论的本质, Chinese translation by 张学工). Beijing: Tsinghua University Press, 2000
  • 2 Smola A., Schölkopf B.. A tutorial on support vector regression. Royal Holloway College, London, U.K.: NeuroCOLT Technical Report TR-1998-030, 1998
  • 3 Duan K., Keerthi S.S., Poo A.N.. Evaluation of simple performance measures for tuning SVM hyperparameters. National University of Singapore, Singapore: Technical Report CD-01-11, 2001
  • 4 Suykens J.A.K., de Brabanter J., Lukas L., Vandewalle J.. Weighted least squares support vector machines: Robustness and sparse approximation. Neurocomputing, 2002, 48(1-4): 85-105
  • 5 Chuang C.-C., Su S.-F., Jeng J.-T., Hsiao C.-C.. Robust support vector regression networks for function approximation with outliers. IEEE Transactions on Neural Networks, 2002, 13(6): 1322-1330
  • 6 Platt J.C.. Fast training of support vector machines using sequential minimal optimization. In: Schölkopf B., Burges C., Smola A. eds. Advances in Kernel Methods: Support Vector Learning. Cambridge, MA: MIT Press, 1998, 185-208
  • 7 Shevade S.K., Keerthi S.S., Bhattacharyya C., Murthy K.R.K.. Improvements to the SMO algorithm for SVM regression. IEEE Transactions on Neural Networks, 2000, 11(5): 1188-1193

Co-citing references: 22
Co-cited references: 135
Citing articles: 13
Second-level citing articles: 80
