Journal article

Accurate incremental ε-twin support vector regression (cited by: 1)
Abstract: To address the problem that existing training algorithms for the ε-twin support vector regression cannot efficiently handle incremental learning for linear regression, an accurate incremental ε-twin support vector regression (AIETSVR) is proposed. First, by calculating the Lagrangian multiplier of the new sample and adjusting the Lagrangian multipliers of the boundary samples, the influence of the new sample's quadratic loss on the existing samples is minimized, so that most of the existing samples still satisfy the Karush–Kuhn–Tucker (KKT) conditions and a valid initial state is obtained. Then, the exceptional Lagrangian multipliers are gradually adjusted until they conform to the KKT conditions. Next, the feasibility and finite convergence of AIETSVR are analyzed theoretically. Finally, simulations are conducted on benchmark datasets. The results show that, compared with existing representative algorithms, AIETSVR obtains accurate solutions and offers a significant advantage in shortening training time on large-scale datasets.
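The abstract's core idea, namely leaving KKT-compliant samples untouched and adjusting only the violating ones when a new sample arrives, can be illustrated with a toy sketch. This is not the authors' AIETSVR: the model below is a plain linear ε-insensitive regressor, and the projection-style update, function names, and parameter values are all illustrative assumptions.

```python
import numpy as np

def kkt_violation(w, b, x, y, eps):
    """A zero-multiplier sample satisfies the epsilon-insensitive KKT
    condition iff its absolute prediction error lies within the eps tube;
    the return value is how far outside the tube the sample sits."""
    return max(0.0, abs(float(np.dot(w, x)) + b - y) - eps)

def incremental_fit(w, b, X_new, y_new, eps=0.1, tol=1e-4, max_passes=500):
    """Toy incremental pass (illustrative only, not AIETSVR): samples
    already inside the eps tube are skipped, mirroring the idea of
    keeping KKT-compliant samples fixed; each violator is projected
    exactly onto the edge of its tube (a Kaczmarz/passive-aggressive
    style step), and passes repeat until no meaningful violation remains."""
    w = np.asarray(w, dtype=float).copy()
    for _ in range(max_passes):
        worst = 0.0
        for x, y in zip(X_new, y_new):
            r = float(np.dot(w, x)) + b - y      # signed prediction error
            v = max(0.0, abs(r) - eps)           # KKT violation amount
            worst = max(worst, v)
            if v > 0.0:
                # exact projection onto the slab |w.x + b - y| <= eps
                tau = v / (float(np.dot(x, x)) + 1.0)
                w -= np.sign(r) * tau * x
                b -= np.sign(r) * tau
        if worst <= tol:   # a full pass saw no meaningful KKT violation
            break
    return w, b

# usage: start from a zero model and absorb three samples of y = 2x
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 2.0, 4.0])
w, b = incremental_fit(np.zeros(1), 0.0, X, y, eps=0.1)
```

Because each update projects exactly onto the edge of a sample's tube, the loop is a cyclic projection onto convex slabs, which converges whenever the slabs intersect; the real AIETSVR instead works on the dual multipliers of the twin formulation with a finite-convergence guarantee.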
Authors: CAO Jie, GU Bin-jie, PAN Feng, XIONG Wei-li (Key Laboratory of Advanced Process Control for Light Industry, Ministry of Education, Jiangnan University, Wuxi, Jiangsu 214122, China)
Published in: Control Theory & Applications (《控制理论与应用》), 2022, No. 6, pp. 1020–1032 (13 pages). Indexed in EI, CAS, CSCD, and the Peking University Core Journal list.
Funding: Supported by the National Natural Science Foundation of China (61773182).
Keywords: machine learning; incremental learning; online learning; twin support vector regression; learning algorithms; feasibility analysis; finite convergence analysis
