Funding: The research of S.-Q. Ma was supported in part by the Hong Kong Research Grants Council General Research Fund Early Career Scheme (No. CUHK 439513). The research of S.-Z. Zhang was supported in part by the National Natural Science Foundation (No. CMMI 1161242).
Abstract: The alternating direction method of multipliers (ADMM) is widely used for solving structured convex optimization problems. Despite its success in practice, the convergence of the standard ADMM for minimizing the sum of N (N ≥ 3) convex functions, whose variables are linked by linear constraints, remained unclear for a long time. Recently, Chen et al. (Math Program, doi:10.1007/s10107-014-0826-5, 2014) provided a counter-example showing that the ADMM for N ≥ 3 may fail to converge without further conditions. Since the ADMM for N ≥ 3 has been very successful when applied to many problems arising in practice, it is worth investigating under what sufficient conditions it can be guaranteed to converge. In this paper, we present such sufficient conditions that guarantee a sublinear convergence rate for the ADMM for N ≥ 3. Specifically, we show that if one of the functions is convex (not necessarily strongly convex), the other N − 1 functions are strongly convex, and the penalty parameter lies in a certain region, then the ADMM converges with rate O(1/t) in a certain ergodic sense and o(1/t) in a certain non-ergodic sense, where t denotes the number of iterations. As a by-product, we also provide a simple proof of the O(1/t) convergence rate of the two-block ADMM in terms of both objective error and constraint violation, without assuming any condition on the penalty parameter or strong convexity of the functions.
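For reference, the algorithm discussed here is the standard Gauss–Seidel multi-block ADMM applied to minimizing \(\sum_{i=1}^{N} f_i(x_i)\) subject to \(\sum_{i=1}^{N} A_i x_i = b\). A minimal sketch of one iteration, written with a common sign convention (the paper's exact formulation may differ in minor details such as the sign of the multiplier update), is:
\[
\begin{aligned}
x_i^{k+1} &= \arg\min_{x_i}\; \mathcal{L}_\rho\bigl(x_1^{k+1},\dots,x_{i-1}^{k+1},\,x_i,\,x_{i+1}^{k},\dots,x_N^{k};\,\lambda^k\bigr), \qquad i = 1,\dots,N,\\
\lambda^{k+1} &= \lambda^k - \rho\Bigl(\textstyle\sum_{i=1}^{N} A_i x_i^{k+1} - b\Bigr),
\end{aligned}
\]
where the augmented Lagrangian is
\[
\mathcal{L}_\rho(x_1,\dots,x_N;\lambda) = \sum_{i=1}^{N} f_i(x_i) - \lambda^{\top}\Bigl(\sum_{i=1}^{N} A_i x_i - b\Bigr) + \frac{\rho}{2}\Bigl\|\sum_{i=1}^{N} A_i x_i - b\Bigr\|^2,
\]
and \(\rho > 0\) is the penalty parameter whose admissible range the paper characterizes. With N = 2 this reduces to the classical two-block ADMM for which the by-product O(1/t) result is stated.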
Funding: Supported by the National Natural Science Foundation of China (No. 61179033).
Abstract: We focus on the convergence analysis of the extended linearized alternating direction method of multipliers (L-ADMM) for solving convex minimization problems with three or more separable blocks in the objective function. Previous convergence analyses of the L-ADMM reduce the multi-block convex minimization problem to two blocks by grouping the variables. Moreover, no rate-of-convergence analysis has been available for the L-ADMM. In this paper, we construct a counter-example showing that the extended L-ADMM can fail to converge. We then prove convergence and establish a sublinear convergence rate for the extended L-ADMM under the assumptions that the proximal gradient step sizes are smaller than certain values and that any two coefficient matrices in the linear constraints are orthogonal.
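One common form of the linearized update, written in the notation of the multi-block problem above, replaces the exact \(x_i\)-subproblem with a proximal gradient step that linearizes the augmented quadratic penalty at the current point. A sketch, with \(\tau_i\) denoting the proximal gradient step size mentioned in the abstract (the paper's exact update rule may differ), is:
\[
x_i^{k+1} = \arg\min_{x_i}\; f_i(x_i) + \bigl\langle A_i^{\top}\bigl(\rho\, r_i^{k} - \lambda^k\bigr),\, x_i - x_i^{k}\bigr\rangle + \frac{1}{2\tau_i}\bigl\|x_i - x_i^{k}\bigr\|^2,
\]
where \(r_i^{k} = \sum_{j<i} A_j x_j^{k+1} + \sum_{j\ge i} A_j x_j^{k} - b\) is the partial residual at the start of the i-th update and the multiplier \(\lambda\) is updated as in the standard ADMM. The abstract's conditions (sufficiently small \(\tau_i\) and pairwise-orthogonal coefficient matrices \(A_i\)) are the assumptions under which this inexact update is shown to converge in the multi-block setting.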