Abstract
Stochastic Variational Inference (SVI) has been successfully applied to many types of models, including topic models. Although SVI scales to large data sets by mapping the inference problem to an optimization problem involving stochastic gradients, the noise inherent in those stochastic gradients produces large variance, which hinders fast convergence. To address this problem, an improved Variance-Reduced SVI (VR-SVI) algorithm is proposed. First, a sliding-window method is used to recompute the noise term in the stochastic gradient, constructing a new stochastic gradient and reducing the influence of noise on it. Then, it is proved that the proposed algorithm reduces the variance of the stochastic gradient relative to standard SVI. Finally, the influence of the window size on the algorithm is discussed, and the convergence of the algorithm is analyzed. Experimental results show that VR-SVI both reduces the variance of the stochastic gradient and saves computation time, achieving fast convergence.
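The core idea in the abstract — averaging recent noisy gradient estimates over a sliding window to shrink their variance — can be illustrated with a minimal sketch. All names, the window size `W`, and the synthetic gradient model below are hypothetical for illustration; the paper's actual method operates on the natural-gradient noise term inside the SVI update, not on raw gradients.

```python
from collections import deque
import numpy as np

def sliding_window_gradient(window, new_grad):
    """Append the newest noisy gradient estimate and return the window average."""
    window.append(new_grad)
    return np.mean(window, axis=0)

rng = np.random.default_rng(0)
true_grad = np.array([1.0, -2.0])   # synthetic "true" gradient
W = 10                              # hypothetical window size
window = deque(maxlen=W)            # sliding window of recent estimates

plain, smoothed = [], []
for t in range(1000):
    noisy = true_grad + rng.normal(scale=1.0, size=2)  # noisy stochastic gradient
    plain.append(noisy)
    smoothed.append(sliding_window_gradient(window, noisy))

# Once the window is full, averaging W i.i.d. estimates should cut the
# variance by roughly a factor of W, at the cost of some staleness.
var_plain = np.var(np.array(plain), axis=0).mean()
var_smooth = np.var(np.array(smoothed)[W:], axis=0).mean()
print(var_smooth < var_plain)
```

The trade-off the abstract hints at (discussing the influence of window size) is visible here: a larger `W` reduces variance further but makes the averaged gradient respond more slowly when the underlying gradient changes.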
Authors
LIU Zhanghu, CHENG Chunling (School of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing, Jiangsu 210003, China)
Source
Journal of Computer Applications (《计算机应用》)
CSCD; Peking University Core Journals (北大核心)
2018, Issue 6, pp. 1675-1681 (7 pages)
Keywords
Stochastic Variational Inference (SVI)
sliding window
stochastic gradient
variance reduction
topic modeling