A Mini-Batch Proximal Stochastic Recursive Gradient Algorithm with Diagonal Barzilai–Borwein Stepsize (Cited by: 1)
Authors: Teng-Teng Yu, Xin-Wei Liu, Yu-Hong Dai, Jie Sun. Journal of the Operations Research Society of China (EI, CSCD), 2023, Issue 2, pp. 277-307 (31 pages).
Abstract: Many machine learning problems can be formulated as minimizing the sum of a function and a non-smooth regularization term. Proximal stochastic gradient methods are popular for solving such composite optimization problems. We propose a mini-batch proximal stochastic recursive gradient algorithm, SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB) stepsize strategy to capture the local geometry of the problem. The linear convergence and complexity of SRG-DBB are analyzed for strongly convex functions. We further establish the linear convergence of SRG-DBB under a non-strong convexity condition. Moreover, it is proved that SRG-DBB converges sublinearly in the convex case. Numerical experiments on standard data sets indicate that the performance of SRG-DBB is better than or comparable to that of the proximal stochastic recursive gradient algorithm with best-tuned scalar stepsizes or BB stepsizes. Furthermore, SRG-DBB is superior to some advanced mini-batch proximal stochastic gradient methods.
Keywords: Stochastic recursive gradient, Proximal gradient algorithm, Barzilai–Borwein method, Composite optimization
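The abstract describes a proximal stochastic recursive gradient loop combined with a diagonal Barzilai–Borwein (BB) stepsize. The following is a minimal sketch of how such a method can be organized, assuming a SARAH-style recursive gradient estimator, an l1 regularizer with a soft-thresholding proximal map, and a safeguarded componentwise BB ratio for the diagonal stepsize. The function names, safeguards, and update details are illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch of a proximal stochastic recursive gradient loop with a
# diagonal Barzilai-Borwein stepsize, in the spirit of SRG-DBB. The SARAH-style
# estimator, the safeguarded componentwise BB ratio, and the l1 prox below are
# illustrative assumptions, not the authors' exact method.
import numpy as np


def soft_threshold(x, tau):
    """Proximal map of tau * ||x||_1 (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)


def srg_dbb_sketch(grad_full, grad_batch, prox, x0, n_samples,
                   epochs=10, batch_size=16, inner_steps=50, eta0=0.1, seed=0):
    """Hypothetical outer/inner loop: full gradient once per epoch, recursive
    mini-batch updates inside, diagonal BB stepsize refreshed each epoch."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float)
    D = np.full_like(x, eta0)           # diagonal stepsize, stored as a vector
    x_prev = g_prev = None
    for _ in range(epochs):
        v = grad_full(x)                # full gradient at the reference point
        if x_prev is not None:
            # Componentwise "secant" ratio |s_i| / |y_i|, clipped as a safeguard
            # (an assumed stand-in for the paper's diagonal BB formula).
            s, y = x - x_prev, v - g_prev
            D = np.clip(np.abs(s) / (np.abs(y) + 1e-10), 1e-4, 10.0)
        x_prev, g_prev = x.copy(), v.copy()
        x_in = x.copy()
        for _ in range(inner_steps):
            x_new = prox(x_in - D * v, D)   # proximal step with diagonal stepsize
            idx = rng.choice(n_samples, size=batch_size, replace=False)
            # SARAH-style recursive gradient estimator.
            v = v + grad_batch(x_new, idx) - grad_batch(x_in, idx)
            x_in = x_new
        x = x_in
    return x


# Usage: l1-regularized least squares, (1/2n)*||A x - b||^2 + lam*||x||_1.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d, lam = 200, 50, 0.01
    A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
    grad_full = lambda x: A.T @ (A @ x - b) / n
    grad_batch = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
    prox = lambda z, D: soft_threshold(z, lam * D)
    x_hat = srg_dbb_sketch(grad_full, grad_batch, prox, np.zeros(d), n)
    print("nonzeros in solution:", np.count_nonzero(x_hat))
```

In this sketch the diagonal stepsize is refreshed once per epoch from the change in the full gradient, which is one plausible componentwise adaptation of a BB rule; the paper's actual stepsize formula, safeguards, and convergence conditions are given in the original article.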