Stochastic Recursive Gradient Algorithm with Approximately Optimal Stepsize
Abstract: In machine learning, we often consider the problem of minimizing an objective function that is a sum of convex functions. The stochastic recursive gradient algorithm (SARAH) is a popular method for solving this problem; it uses a simple recursive framework to update its stochastic gradient estimate. Building on SARAH, this paper proposes using an approximately optimal stepsize (AOS) to adaptively compute the stepsize for SARAH, which yields the SARAH-AOS algorithm. Numerical experiments on the proposed algorithm show that SARAH-AOS is not as sensitive to the choice of initial stepsize as SARAH, and that it delivers a significant performance improvement over SARAH.
Author: 陈炫睿 (Chen Xuanrui)
Source: Operations Research and Fuzziology (《运筹与模糊学》), 2023, No. 5, pp. 4318-4326 (9 pages)
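The abstract does not state the paper's AOS formula, so the following is only a minimal sketch of the standard SARAH recursion from the literature, v_t = ∇f_{i_t}(w_t) - ∇f_{i_t}(w_{t-1}) + v_{t-1}, with a fixed stepsize eta standing in as a placeholder for the adaptive AOS rule. The function name sarah and the toy least-squares setup are illustrative assumptions, not the paper's code.

import numpy as np

def sarah(grad_i, n, w0, eta, outer_epochs, inner_steps, rng=None):
    """Sketch of the SARAH loop for min_w (1/n) * sum_i f_i(w).

    grad_i(w, i) returns the gradient of the i-th component f_i at w.
    eta is a fixed stepsize; the paper instead computes it adaptively
    via an approximately optimal stepsize (AOS), whose formula is not
    given in the abstract.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(outer_epochs):
        # Each outer iteration starts from the full gradient.
        v = sum(grad_i(w, i) for i in range(n)) / n
        w_prev, w = w, w - eta * v
        for _ in range(inner_steps):
            i = int(rng.integers(n))
            # SARAH's recursive stochastic gradient estimate:
            # v_t = grad f_i(w_t) - grad f_i(w_{t-1}) + v_{t-1}
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev, w = w, w - eta * v
    return w

# Toy usage on least squares, f_i(w) = 0.5 * (a_i . w - b_i)^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((100, 5)), rng.standard_normal(100)
g = lambda w, i: (A[i] @ w - b[i]) * A[i]
w_hat = sarah(g, n=100, w0=np.zeros(5), eta=0.05, outer_epochs=30, inner_steps=100)

Keeping eta fixed in this sketch makes the sensitivity to the initial stepsize described in the abstract easy to observe; an adaptive rule such as the paper's AOS would instead recompute the stepsize during the iteration.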