Journal Articles
1 article found
N-SVRG:Stochastic Variance Reduction Gradient with Noise Reduction Ability for Small Batch Samples
Authors: Haijie Pan, Lirong Zheng. Computer Modeling in Engineering & Sciences (SCIE, EI), 2022, Issue 4, pp. 493-512 (20 pages).
Machine learning models converge slowly and train unstably under SGD because estimating the gradient from a random sample introduces large variance. To this end, we propose a noise reduction method for the Stochastic Variance Reduced Gradient (SVRG) algorithm, called N-SVRG, which uses small batches of samples instead of all samples to compute the average gradient, while updating that average gradient incrementally. In each round of iteration, a small batch of samples is randomly selected to compute the average gradient, and during the inner iterations the average gradient is incrementally updated with the past model gradients. By suitably reducing the batch size B, both the memory storage and the number of iterations can be reduced. Experiments comparing N-SVRG with the state-of-the-art Mini-Batch SGD, AdaGrad, RMSProp, SVRG and SCSG show that N-SVRG outperforms SVRG and SASG and is on par with SCSG. Finally, by exploring the relationship between small values of the parameters n, B and k and the effectiveness of the algorithm, we show that N-SVRG is stable and can achieve sufficient accuracy even with a small batch size. The advantages and disadvantages of the various methods are compared experimentally, and the stability of N-SVRG is explored through different parameter settings.
Keywords: Machine learning, SGD, SVRG, memory storage
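The abstract describes an SVRG-style loop whose anchor ("average") gradient is computed on a small random batch rather than on the full dataset. The snippet below is a minimal illustrative sketch of that idea only, not the paper's exact N-SVRG algorithm: the incremental/rounded update of the average gradient during inner iterations is omitted, and the function name `svrg_small_batch`, the hyperparameter values, and the toy least-squares problem are assumptions made for the example.

```python
import numpy as np

def svrg_small_batch(grad_fn, w0, n, epochs=10, inner=50, batch=8, lr=0.1, seed=0):
    """SVRG-style loop whose anchor gradient is estimated on a small random
    batch of size `batch` instead of all n samples (a sketch of the idea
    described in the abstract, not the published N-SVRG algorithm).

    grad_fn(w, idx) must return the gradient averaged over the samples
    indexed by `idx`.
    """
    rng = np.random.default_rng(seed)
    w = w0.astype(float).copy()
    for _ in range(epochs):
        anchor_w = w.copy()
        # Small-batch anchor gradient (full-data average in classical SVRG).
        anchor_idx = rng.choice(n, size=batch, replace=False)
        mu = grad_fn(anchor_w, anchor_idx)
        for _ in range(inner):
            idx = np.array([rng.integers(n)])
            # Variance-reduced stochastic gradient:
            # current sample gradient, corrected by the anchor point.
            g = grad_fn(w, idx) - grad_fn(anchor_w, idx) + mu
            w -= lr * g
    return w

# Toy usage on a least-squares problem (illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.01 * rng.normal(size=200)
grad = lambda w, idx: X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
w_hat = svrg_small_batch(grad, np.zeros(5), n=200)
print(w_hat)
```

In this sketch, shrinking `batch` reduces how much data each anchor computation touches, which is the memory and iteration saving the abstract attributes to reducing B.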