
Online kernel regression based on random sketching method (基于随机素描方法的在线核回归)
Abstract: In online kernel regression learning, the inverse of the kernel matrix has to be recomputed whenever a new sample arrives, and the computational complexity of this process is at least quadratic in the number of rounds. To address this, the sketching method was applied to the hypothesis update, yielding a more efficient online kernel regression algorithm. Firstly, with the loss function set to the squared loss, the Nyström approximation method was applied to approximate the kernel and, following the idea of Follow-The-Leader (FTL), a new gradient descent algorithm called FTL-Online Kernel Regression (F-OKR) was proposed. Then, the sketching method was used to accelerate F-OKR, reducing its computational complexity to linear in the number of rounds and the sketch size, and quadratic in the data dimension. Finally, an efficient Sketched Online Kernel Regression (SOKR) algorithm was designed. Compared with F-OKR, SOKR has almost no loss in accuracy while reducing the runtime by about 16.7% on suitable datasets. Sublinear regret bounds were proved for both algorithms, and experiments on standard regression datasets verify that the proposed algorithms outperform the Nyström Online Gradient Descent (NOGD) algorithm, with the average loss reduced by about 64%.
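The abstract gives no pseudocode, so the following is a minimal illustrative sketch of the underlying idea only: online regression with squared loss on Nyström kernel features and a per-round gradient update. The class name, the RBF kernel choice, the step size eta, and the landmark selection are assumptions made for illustration; this is not the paper's exact F-OKR or SOKR procedure, and the additional sketching step of SOKR is only indicated in a comment.

    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
        sq = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
        return np.exp(-gamma * sq)

    class NystromOnlineRegressor:
        # Online kernel regression with squared loss on Nystrom features (illustrative).
        def __init__(self, landmarks, eta=0.1, gamma=1.0, reg=1e-8):
            self.landmarks = landmarks        # (m, d) array of landmark points
            self.eta = eta                    # gradient step size
            self.gamma = gamma
            # Precompute K_mm^{-1/2} once, so no matrix inverse is recomputed per round.
            K_mm = rbf_kernel(landmarks, landmarks, gamma) + reg * np.eye(len(landmarks))
            vals, vecs = np.linalg.eigh(K_mm)
            self.inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, reg))) @ vecs.T
            self.w = np.zeros(len(landmarks))  # hypothesis in the m-dimensional feature space

        def features(self, x):
            # Approximate kernel feature map of dimension m; a sketching step
            # (e.g. a random projection of this vector) could be inserted here
            # to shrink the hypothesis further, in the spirit of SOKR.
            k_x = rbf_kernel(x[None, :], self.landmarks, self.gamma)
            return (k_x @ self.inv_sqrt).ravel()

        def step(self, x_t, y_t):
            # One online round: predict, suffer the squared loss, take a gradient step.
            z_t = self.features(x_t)
            y_hat = float(self.w @ z_t)
            loss = 0.5 * (y_hat - y_t) ** 2
            self.w -= self.eta * (y_hat - y_t) * z_t
            return y_hat, loss

    # Toy usage on a synthetic stream (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    model = NystromOnlineRegressor(landmarks=X[:20], eta=0.5, gamma=0.5)
    total_loss = sum(model.step(x_t, y_t)[1] for x_t, y_t in zip(X, y))

Under these assumptions, after a one-off preprocessing of the landmark kernel matrix, each round costs time linear in the number of landmarks times the data dimension plus quadratic in the number of landmarks, instead of the at-least-quadratic-in-rounds cost of inverting the full kernel matrix noted in the abstract.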
Authors: LIU Qinghua (刘清华), LIAO Shizhong (廖士中) (College of Intelligence and Computing, Tianjin University, Tianjin 300350, China)
Source: Journal of Computer Applications (《计算机应用》; CSCD, Peking University Core Journal), 2022, No. 3, pp. 676-682 (7 pages)
Funding: National Natural Science Foundation of China (62076181).
Keywords: online learning; sketching method; regret analysis; regression; kernel method