Funding: Supported by the National Natural Science Foundation of China (61662036).
Abstract: The alternating direction method of multipliers (ADMM) has received much attention in recent years due to various demands from machine learning and big-data-related optimization. In 2013, Ouyang et al. extended ADMM to the stochastic setting for solving certain stochastic optimization problems, inspired by the structural risk minimization principle. In this paper, we consider a stochastic variant of symmetric ADMM, named symmetric stochastic linearized ADMM (SSL-ADMM). In particular, using the framework of variational inequalities, we analyze the convergence properties of SSL-ADMM. Moreover, we show that, with high probability, SSL-ADMM has an O((ln N)·N^(-1/2)) constraint violation bound and objective error bound for convex problems, and an O((ln N)^2·N^(-1)) constraint violation bound and objective error bound for strongly convex problems, where N is the iteration number. Symmetric ADMM can improve algorithmic performance compared with classical ADMM, and numerical experiments on statistical machine learning show that this improvement is also present in the stochastic setting.
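For a concrete picture, the following is a minimal sketch of what one SSL-ADMM iteration plausibly looks like, written for the standard two-block problem min_{x,y} f(x) + g(y) subject to Ax + By = b. The abstract does not spell out the scheme, so the symbols below are assumptions rather than the paper's notation: G(x^k, ξ^k) denotes an unbiased stochastic gradient of f, η_k a proximal step size, β the penalty parameter, and r, s the two dual step factors that make the method symmetric. The paper's exact updates (for example, which block is linearized and the admissible range of r and s) may differ.

\begin{aligned}
x^{k+1} &= \arg\min_{x}\ \langle G(x^k,\xi^k),\, x\rangle
  + \tfrac{\beta}{2}\bigl\|Ax + By^k - b + \lambda^k/\beta\bigr\|^2
  + \tfrac{1}{2\eta_k}\|x - x^k\|^2
  && \text{(stochastic linearized step)} \\
\lambda^{k+1/2} &= \lambda^k + r\beta\,(Ax^{k+1} + By^k - b)
  && \text{(first dual update)} \\
y^{k+1} &= \arg\min_{y}\ g(y)
  + \tfrac{\beta}{2}\bigl\|Ax^{k+1} + By - b + \lambda^{k+1/2}/\beta\bigr\|^2
  && \text{(second block, solved exactly)} \\
\lambda^{k+1} &= \lambda^{k+1/2} + s\beta\,(Ax^{k+1} + By^{k+1} - b)
  && \text{(second dual update)}
\end{aligned}

Updating the multiplier λ both before and after the y-step is what distinguishes the symmetric variant from classical ADMM, which performs only the final dual update (r = 0, s = 1); replacing f by its stochastic linearization plus the proximal term (1/(2η_k))·||x − x^k||^2 is what keeps the x-subproblem tractable when only noisy gradients of f are available.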