An Accelerated Stochastic Mirror Descent Method
Authors: Bo-Ou Jiang, Ya-Xiang Yuan. Journal of the Operations Research Society of China (EI), 2024, Issue 3, pp. 549-571 (23 pages).
Abstract: Driven by large-scale optimization problems arising from machine learning, the development of stochastic optimization methods has seen enormous growth. Numerous methods have been built on the vanilla stochastic gradient descent method. However, for most algorithms the convergence rate in the stochastic setting does not simply match that in the deterministic setting. The main goal of this paper is to better understand the gap between deterministic and stochastic optimization. Specifically, we are interested in Nesterov acceleration of gradient-based approaches. In our study, we focus on accelerating a stochastic mirror descent method with an implicit regularization property. Assuming that the objective is smooth and convex or strongly convex, our analysis prescribes method parameters that ensure fast convergence of the estimation error and satisfactory numerical performance.
Keywords: Large-scale optimization; Variance reduction; Mirror descent; Acceleration; Independent sampling; Importance sampling
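
The abstract names the building blocks, stochastic mirror descent plus Nesterov acceleration, without spelling out the update. As a rough illustration only, the Python sketch below shows the generic accelerated stochastic mirror descent template under a quadratic mirror map psi(x) = ||x||^2 / 2, for which the Bregman proximal step reduces to a plain gradient step. The function names, the momentum schedule beta_k = (k-1)/(k+2), and the step size are illustrative assumptions, not the parameter choices or variance-reduction scheme prescribed in the paper.

import numpy as np

def accelerated_smd(stoch_grad, x0, n_iters=1000, lr=1e-3):
    """Hypothetical sketch of accelerated stochastic mirror descent;
    not the algorithm or parameter schedule analyzed in the paper.

    stoch_grad(x) must return an unbiased estimate of the objective's
    gradient at x, e.g. a minibatch gradient.
    """
    x_prev = x = np.asarray(x0, dtype=float)
    for k in range(1, n_iters + 1):
        # Nesterov-style extrapolation point.
        beta = (k - 1) / (k + 2)
        z = x + beta * (x - x_prev)
        g = stoch_grad(z)
        x_prev = x
        # Mirror step: with psi(x) = ||x||^2 / 2, the Bregman update
        #   argmin_u <g, u> + D_psi(u, z) / lr
        # reduces to an ordinary gradient step from z.
        x = z - lr * g
    return x

# Toy usage: minibatch least squares, min_x ||Ax - b||^2 / (2m).
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))
b = A @ rng.standard_normal(10)

def stoch_grad(x, batch=32):
    idx = rng.integers(0, A.shape[0], size=batch)
    Ab, bb = A[idx], b[idx]
    return Ab.T @ (Ab @ x - bb) / batch

x_hat = accelerated_smd(stoch_grad, np.zeros(10), n_iters=2000)

Replacing the quadratic mirror map with the negative entropy would turn the mirror step into a multiplicative (exponentiated-gradient) update on the probability simplex, which is the usual motivation for mirror descent over plain stochastic gradient descent.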