Funding: This research is partly supported by the National Natural Science Foundation of China (Grant No. 11671217) and the Natural Science Foundation of Xinjiang (Grant No. 2017D01A14).
Abstract: In recent years, the alternating direction method of multipliers (ADMM) and its variants have become popular owing to their extensive use in image processing and statistical learning. One variant, the symmetric ADMM, which updates the Lagrange multiplier twice in each iteration, is typically faster whenever it converges. In this paper, by combining symmetric ADMM with Nesterov's acceleration strategy, we propose an accelerated symmetric ADMM. We prove its O(1/k^2) convergence rate under a strong convexity assumption. For the general case, we propose an accelerated method with a restart rule. Preliminary numerical experiments demonstrate the efficiency of our algorithms.
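To make the "multiplier updated twice per iteration" structure concrete, the following is a minimal sketch of the classical symmetric ADMM (not the paper's accelerated algorithm, whose details are not given in this abstract), applied to the LASSO problem min 0.5||Ax-b||^2 + mu||x||_1 via the splitting f(x) = 0.5||Ax-b||^2, g(z) = mu||z||_1, x - z = 0. The step sizes r and s, the penalty beta, and the iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def symmetric_admm_lasso(A, b, mu, beta=1.0, r=0.9, s=0.9, iters=1000):
    """Symmetric ADMM for min 0.5*||Ax - b||^2 + mu*||x||_1,
    split as f(x) = 0.5*||Ax - b||^2, g(z) = mu*||z||_1, s.t. x - z = 0.

    Unlike standard ADMM, the multiplier lam is updated twice per
    iteration: once (step r) between the x- and z-subproblems, and
    once (step s) after the z-subproblem.
    """
    m, n = A.shape
    x, z, lam = np.zeros(n), np.zeros(n), np.zeros(n)
    # Pre-factor the x-subproblem: (A^T A + beta*I) x = A^T b - lam + beta*z.
    M = A.T @ A + beta * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M, Atb - lam + beta * z)    # x-subproblem
        lam = lam + r * beta * (x - z)                  # first multiplier update
        z = soft_threshold(x + lam / beta, mu / beta)   # z-subproblem (prox of g)
        lam = lam + s * beta * (x - z)                  # second multiplier update
    return x
```

The intermediate multiplier update lets the z-subproblem see fresher dual information, which is the mechanism behind the speedup the abstract refers to; the paper's contribution is to layer Nesterov-type extrapolation (with a restart rule in the general case) on top of this scheme.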