
A Study of Incremental Huber-Support Vector Regression Algorithm (Cited: 1)
Abstract: In traditional one-shot modeling algorithms for support vector regression (SVR), the model must be retrained from scratch whenever new samples arrive, whereas an incremental algorithm can make full use of the results learned in the previous stage. Incremental SVR algorithms are usually based on the ε-insensitive loss function, which is sensitive to large outliers; the Huber loss function, by contrast, is much less sensitive to outliers. In noisy real-world settings, the Huber loss function is therefore a better choice than the ε-insensitive loss function. Based on this, this paper proposes an incremental Huber-SVR algorithm built on the Huber loss function, which continuously integrates new sample information into the already constructed model instead of remodeling. Compared with the incremental ε-SVR algorithm and the incremental RBF algorithm, the incremental Huber-SVR algorithm achieves higher prediction accuracy in predictive modeling on real data.
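The paper's exact incremental update rule is not reproduced here, but the core idea it describes (a robust Huber loss combined with incremental learning that reuses the existing model as new samples arrive, rather than refitting from scratch) can be sketched with scikit-learn's `SGDRegressor`. Note this is an illustrative stand-in, not the authors' algorithm: `SGDRegressor` with `loss="huber"` and `partial_fit` is an assumption of this sketch.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)

# Synthetic linear data y = 2x + 1 with small noise and a few large outliers.
X = rng.uniform(-3, 3, size=(200, 1))
y = 2.0 * X.ravel() + 1.0 + rng.normal(scale=0.1, size=200)
y[:10] += 20.0  # inject outliers that would mislead a non-robust loss

# Huber loss caps the influence of large residuals; partial_fit updates the
# already-trained model with each arriving batch instead of retraining from
# scratch, mimicking the incremental setting described in the abstract.
model = SGDRegressor(loss="huber", epsilon=1.35, random_state=0)
batches = np.array_split(np.arange(200), 10)
for _ in range(20):            # several passes over the arriving batches
    for idx in batches:
        model.partial_fit(X[idx], y[idx])

print(model.coef_[0], model.intercept_[0])  # slope should end up near 2
```

Despite 5% of the targets being shifted by +20, the capped Huber gradient keeps the fitted slope close to the true value, which is the robustness property the paper exploits.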
Authors: ZHOU Xiao-jian; XIAO Dan; FU Yu (School of Management, Nanjing University of Posts and Telecommunications, Nanjing 210023, China; School of Information, Xiamen University, Xiamen 361005, China)
Source: Operations Research and Management Science (《运筹与管理》), CSSCI / CSCD / Peking University Core, 2022, No. 8, pp. 137-142 (6 pages)
Funding: National Natural Science Foundation of China (71872088); Natural Science Foundation of Jiangsu Province (BK20190793)
Keywords: incremental algorithm; support vector regression; Huber loss function
