Abstract: In the traditional Takagi-Sugeno (T-S) fuzzy system model, the fuzzy rules use all features of the samples, which results in poor interpretability of the model; moreover, the presence of redundant features can cause overfitting and degrade generalization performance. To address this problem, a new joint sparse modeling method for fuzzy systems, L2-CFS-FIS (L2-common feature selection fuzzy inference systems), is proposed to improve both the generalization performance and the interpretability of the model. The method fully exploits the common feature information shared among fuzzy rules, introduces a mechanism for handling overfitting, reformulates fuzzy system modeling as a doubly regularized joint optimization problem, and solves it with the alternating direction method of multipliers (ADMM). Experimental results show that the fuzzy systems constructed by this method not only achieve satisfactory generalization performance but also, by effectively mining the important common features shared among rules, ensure high interpretability of the model.
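The abstract does not give the explicit objective, so the following Python sketch only illustrates the kind of ADMM iteration that a doubly regularized joint-sparsity model of this flavor leads to. The stand-in objective (a row-sparse l2,1 term for features shared across rule consequents plus a Frobenius term against overfitting), the names admm_l21 and group_soft_threshold, and all parameter values are illustrative assumptions, not the paper's L2-CFS-FIS formulation.

```python
# Hedged sketch: ADMM for a stand-in doubly regularized problem
#   min_W 0.5*||X W - Y||_F^2 + 0.5*l2*||W||_F^2 + l1*||Z||_{2,1},  s.t. W = Z,
# where a zeroed row of Z drops that feature from every rule at once.
import numpy as np

def group_soft_threshold(V, tau):
    """Row-wise shrinkage: the prox of tau*||.||_{2,1}."""
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return scale * V

def admm_l21(X, Y, l1=0.1, l2=0.01, rho=1.0, iters=200):
    d, r = X.shape[1], Y.shape[1]
    Z = np.zeros((d, r)); U = np.zeros((d, r))    # split variable, scaled dual
    G = X.T @ X + (l2 + rho) * np.eye(d)          # fixed; factor once in practice
    XtY = X.T @ Y
    for _ in range(iters):
        W = np.linalg.solve(G, XtY + rho * (Z - U))  # smooth subproblem
        Z = group_soft_threshold(W + U, l1 / rho)    # joint-sparsity subproblem
        U = U + W - Z                                 # dual update
    return Z

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
W_true = np.zeros((8, 3)); W_true[:3] = 1.0          # only 3 informative features
Y = X @ W_true + 0.01 * rng.normal(size=(50, 3))
print(np.linalg.norm(admm_l21(X, Y, l1=1.0), axis=1))  # last 5 row norms ~ 0
```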
Funding: This research is partly supported by the National Natural Science Foundation of China (Grant No. 11671217) and the Natural Science Foundation of Xinjiang (Grant No. 2017D01A14).
Abstract: In recent years, the alternating direction method of multipliers (ADMM) and its variants have become popular through their extensive use in image processing and statistical learning. A variant of ADMM, the symmetric ADMM, which updates the Lagrange multiplier twice in each iteration, is faster whenever it converges. In this paper, we combine the symmetric ADMM with Nesterov's acceleration strategy and propose an accelerated symmetric ADMM. We prove its O(1/k^2) convergence rate under a strong convexity condition. For the general case, an accelerated method with a restart rule is proposed. Preliminary numerical experiments demonstrate the efficiency of our algorithms.
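The paper's exact scheme is not reproduced in the abstract, so the sketch below is only one plausible reading: a symmetric ADMM step (two multiplier updates with factors r and s), a FISTA-style extrapolation of (z, lambda), and a simple residual-based restart, all run on a toy strongly convex problem with closed-form subproblems. The function name, the extrapolation choice, and the restart heuristic are assumptions.

```python
# Hedged sketch of an accelerated symmetric ADMM on the toy problem
#   min_{x,z} 0.5*||x - a||^2 + 0.5*mu*||z||^2   s.t.  x = z,
# whose solution is x = z = a / (1 + mu).
import numpy as np

def accelerated_symmetric_admm(a, mu=1.0, rho=1.0, r=0.9, s=0.9,
                               iters=500, tol=1e-10):
    z = np.zeros_like(a); lam = np.zeros_like(a)
    z_hat, lam_hat = z.copy(), lam.copy()   # extrapolated points
    t, res_prev = 1.0, np.inf               # momentum parameter, restart monitor
    for _ in range(iters):
        # x-step: argmin_x 0.5||x-a||^2 + lam_hat.(x - z_hat) + (rho/2)||x - z_hat||^2
        x = (a - lam_hat + rho * z_hat) / (1.0 + rho)
        # first (intermediate) multiplier update, step factor r
        lam_half = lam_hat + r * rho * (x - z_hat)
        # z-step: argmin_z 0.5*mu*||z||^2 - lam_half.z + (rho/2)||x - z||^2
        z_new = (rho * x + lam_half) / (mu + rho)
        # second multiplier update, step factor s
        lam_new = lam_half + s * rho * (x - z_new)
        res = np.linalg.norm(x - z_new)     # primal residual
        if res < tol:
            z, lam = z_new, lam_new
            break
        if res > res_prev:                  # simple adaptive restart (assumption)
            t, z_hat, lam_hat = 1.0, z_new.copy(), lam_new.copy()
        else:                               # FISTA-style extrapolation
            t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            beta = (t - 1.0) / t_next
            z_hat = z_new + beta * (z_new - z)
            lam_hat = lam_new + beta * (lam_new - lam)
            t = t_next
        z, lam, res_prev = z_new, lam_new, res
    return x, z

x, z = accelerated_symmetric_admm(np.array([3.0, -1.0, 2.0]), mu=1.0)
print(x, z)   # both tend to a/(1+mu) = [1.5, -0.5, 1.0]
```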
Abstract: The Kuhn-Tucker conditions have been used to derive many significant results in economics. Thus far, however, their derivation has been somewhat involved. The author derives the Kuhn-Tucker conditions directly by applying a corollary of Farkas's lemma under the Mangasarian-Fromovitz constraint qualification and shows the boundedness of the Lagrange multipliers.
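For reference, here is a standard statement of the objects the abstract refers to, written in the usual minimization form, which may differ from the author's own notation and sign conventions:

```latex
% Kuhn-Tucker (KKT) conditions for  min f(x)  s.t.  g_i(x) <= 0,  i = 1,...,m,
% and the Mangasarian-Fromovitz constraint qualification (MFCQ).
\begin{align*}
  \text{KKT:}\quad
    & \nabla f(x^\star) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^\star) = 0,\\
    & \lambda_i \ge 0,\qquad \lambda_i\, g_i(x^\star) = 0,
      \qquad i = 1,\dots,m;\\[4pt]
  \text{MFCQ:}\quad
    & \exists\, d \ \text{such that}\ \nabla g_i(x^\star)^{\top} d < 0
      \ \text{for every active constraint } i\ \ (g_i(x^\star) = 0).
\end{align*}
% Under MFCQ the set of multipliers \lambda satisfying the KKT system at
% x^\star is bounded, which is the boundedness result the abstract mentions.
```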
Funding: Bing-Sheng He and Ming-Hua Xu were supported by the National Natural Science Foundation of China (No. 11471156); Xiao-Ming Yuan was supported by the General Research Fund of the Hong Kong Research Grants Council (No. HKBU 12313516).
Abstract: It has been shown that the alternating direction method of multipliers (ADMM) is not necessarily convergent when it is directly extended to a multiple-block linearly constrained convex minimization model whose objective function is the sum of more than two functions without coupled variables. Recently, we proposed the block-wise ADMM, obtained by regrouping the variables and functions of such a model into two blocks and then applying the original ADMM block-wise. This note is a further study of this topic, with the purpose of showing that a well-known relaxation factor proposed by Fortin and Glowinski for iteratively updating the Lagrange multiplier of the original ADMM can also be used in the block-wise ADMM. We thus propose the block-wise ADMM with Fortin and Glowinski's relaxation factor for the multiple-block convex minimization model. As with the block-wise ADMM, we also suggest further decomposing the resulting subproblems and regularizing them with proximal terms to ensure convergence. For the block-wise ADMM with Fortin and Glowinski's relaxation factor, we derive its convergence and a worst-case convergence rate measured by iteration complexity in the ergodic sense.
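The essential mechanism here is the relaxed multiplier update with factor gamma in (0, (1+sqrt(5))/2). The sketch below shows that update in a plain two-block ADMM on a toy problem with closed-form subproblems; in the block-wise scheme of the paper, each of the two blocks would collect several original blocks and the subproblems would be further decomposed and regularized by proximal terms, a layer omitted here. The function name and parameter values are illustrative.

```python
# Hedged sketch: two-block ADMM with Fortin-Glowinski relaxation on
#   min 0.5*||x - a||^2 + 0.5*||y - b||^2   s.t.  x + y = c.
import numpy as np

def admm_fg(a, b, c, beta=1.0, gamma=1.5, iters=300, tol=1e-10):
    assert 0 < gamma < (1 + np.sqrt(5)) / 2   # Fortin-Glowinski range
    y = np.zeros_like(a); lam = np.zeros_like(a)
    for _ in range(iters):
        # x-subproblem (closed form for this quadratic)
        x = (a - lam + beta * (c - y)) / (1 + beta)
        # y-subproblem
        y = (b - lam + beta * (c - x)) / (1 + beta)
        # relaxed multiplier update: the only change vs. plain ADMM (gamma = 1)
        lam = lam + gamma * beta * (x + y - c)
        if np.linalg.norm(x + y - c) < tol:
            break
    return x, y

a = np.array([1.0, 0.0]); b = np.array([0.0, 1.0]); c = np.array([2.0, 2.0])
x, y = admm_fg(a, b, c)
print(x, y)   # expect x = [1.5, 0.5], y = [0.5, 1.5]
```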