Abstract
Stochastic optimization methods have become the method of choice for large-scale regularized and deep learning optimization problems. Their convergence rates are usually established under the assumption that the gradient of the objective function is estimated without bias; in machine learning, however, many situations give rise to biased gradients. In contrast to the unbiased case, the well-known Nesterov accelerated gradient (NAG) method accumulates the gradient bias at every iteration, so the optimal convergence rate is lost and even convergence itself may fail. Recent work has shown that NAG is also an accelerated algorithm for the individual convergence of projected subgradient methods on non-smooth problems, but the effect of biased subgradients on it has not yet been studied. For non-smooth optimization problems, this study proves that NAG attains a stable individual convergence bound when the subgradient bias is bounded, and that it still attains the optimal individual convergence rate when the subgradient bias decays at an appropriate rate. As an application, an inexact projected subgradient method is derived in which the projection need not be computed exactly; it approaches a stable learning accuracy faster while retaining convergence. Experiments verify the correctness of the theoretical analysis and the performance of the inexact method.
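The setting described in the abstract can be sketched in a few lines of code. This is a minimal illustrative sketch, not the paper's exact algorithm: the momentum schedule (t-1)/(t+2), the step-size schedule 1/t^{3/2}, the toy objective f(x) = ||x - a||_1 over the unit ball, and the artificial bias with norm decaying like 1/t are all assumptions made for the example.

```python
import numpy as np

def nag_biased_subgradient(subgrad, project, x0, n_iter=1000, bias_scale=1.0, seed=0):
    """Nesterov-style projected subgradient with a decaying subgradient bias.

    Illustrative sketch only: at step t a perturbation of norm at most
    bias_scale / t is added to the subgradient to mimic a biased oracle,
    and the last (individual) iterate is returned rather than an average.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for t in range(1, n_iter + 1):
        lr = 1.0 / t ** 1.5                       # diminishing step size (assumed schedule)
        g = subgrad(y)
        e = rng.normal(size=g.shape)
        e *= bias_scale / (t * (np.linalg.norm(e) + 1e-12))   # bias of norm <= bias_scale / t
        x_new = project(y - lr * (g + e))         # biased subgradient step, then projection
        y = x_new + (t - 1.0) / (t + 2.0) * (x_new - x)       # Nesterov extrapolation
        x = x_new
    return x

# Toy non-smooth problem: minimize f(x) = ||x - a||_1 over the Euclidean unit ball.
a = np.array([0.5, -0.3, 0.2])                    # the minimizer, since ||a|| < 1
subgrad = lambda x: np.sign(x - a)                # a subgradient of the l1 distance
project = lambda v: v / max(1.0, np.linalg.norm(v))
x_last = nag_biased_subgradient(subgrad, project, np.zeros(3))
```

Because the bias norm decays as O(1/t), the individual iterate `x_last` settles near the minimizer despite the perturbed oracle; with a non-decaying bias the error would accumulate, as the abstract notes.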
Authors
LIU Yu-Xiang; CHENG Yu-Jia; TAO Qing (Department of Information Engineering, PLA Army Academy of Artillery and Air Defense, Hefei 230031, China)
Source
Journal of Software (《软件学报》), 2020, No. 4, pp. 1051-1062 (12 pages). Indexed in EI, CSCD, and the Peking University Core Journals list.
Funding
National Natural Science Foundation of China (61673394).