Funding: Supported by the National Natural Science Foundation of China (No. 61303264) and the National Key Research and Development Program of China (No. 2016YFB1000401).
Abstract: In this study, we propose and compare stochastic variants of the extra-gradient alternating direction method: the stochastic extra-gradient alternating direction method with the Lagrangian function (SEGL) and the stochastic extra-gradient alternating direction method with the augmented Lagrangian function (SEGAL). Both are designed to solve large-scale graph-guided optimization problems composed of two convex objective functions. A number of important applications in machine learning follow the graph-guided optimization formulation, such as linear regression, logistic regression, Lasso, structured extensions of Lasso, and structured regularized logistic regression. We conduct experiments on fused logistic regression and graph-guided regularized regression. Experimental results on several types of datasets demonstrate that the proposed algorithms outperform other competing algorithms, and that SEGAL performs better than SEGL in practice.
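For context, graph-guided problems of this kind are commonly written as a composite of a data-fitting loss and a convex regularizer applied to a linear map induced by the feature graph. The abstract does not reproduce the paper's exact formulation, so the sketch below is a generic, assumed instance: f denotes the loss (a logistic loss is shown as an example), A is a matrix encoding graph edges, g is the regularizer, and λ is a regularization weight; all of this notation is illustrative rather than taken from the paper.

% Illustrative sketch only: a generic graph-guided composite objective.
% The symbols f, g, A, lambda, a_i, b_i are assumed notation, not necessarily the authors'.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% f(x): convex data-fitting loss, e.g. the average logistic loss over n samples (a_i, b_i);
% A:    matrix built from the feature graph, each row of Ax giving a difference of coordinates along an edge;
% g:    convex regularizer applied to Ax, e.g. the l1 norm, which yields a fused / graph-guided penalty.
\[
  \min_{x \in \mathbb{R}^{d}} \; f(x) + \lambda\, g(Ax),
  \qquad
  f(x) = \frac{1}{n} \sum_{i=1}^{n} \log\!\bigl(1 + \exp(-b_i\, a_i^{\top} x)\bigr),
  \qquad
  g(Ax) = \lVert A x \rVert_{1}.
\]
\end{document}

ADMM-type methods such as SEGL and SEGAL typically handle this structure by introducing an auxiliary variable for Ax and working with either the Lagrangian or the augmented Lagrangian of the resulting linearly constrained problem, which is the distinction between the two variants compared in this paper.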