Abstract: The alternating direction method of multipliers (ADMM) is widely used across many fields, and the literature customizes its variants to different application scenarios [1] [2] [3] [4]. Among these variants, the linearized alternating direction method of multipliers (LADMM) has received extensive attention because of its effectiveness and ease of implementation. This paper discusses the application of ADMM to dictionary learning, a non-convex problem. Numerical experiments show that ADMM converges slowly when high accuracy is required, especially near the optimal solution. We therefore introduce LADMM to accelerate convergence. Specifically, the quadratic term of each subproblem is linearized so that the update has a simple closed form, and the convergence of the resulting algorithm is proved. The paper closes with a brief summary.
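For readers unfamiliar with the linearization step, the following is a generic sketch of how LADMM modifies an ADMM subproblem; the symbols f, A, b^k, β, and τ are illustrative and not taken from the paper.

```latex
% Generic ADMM x-subproblem (b^k collects the other block and the multiplier):
%   x^{k+1} = argmin_x  f(x) + (beta/2) ||A x - b^k||^2.
% LADMM linearizes the quadratic term at x^k and adds a proximal term:
\[
x^{k+1} = \arg\min_x \; f(x)
  + \beta \,\bigl\langle A^{\top}(A x^k - b^k),\, x \bigr\rangle
  + \frac{1}{2\tau}\,\| x - x^k \|^2
  = \operatorname{prox}_{\tau f}\!\bigl(x^k - \tau \beta A^{\top}(A x^k - b^k)\bigr),
\]
% which is a single proximal step whenever prox_f is simple; a standard
% step-size choice in the convex setting is 0 < tau < 1/(beta ||A||^2).
```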
Funding: Supported by the National High Technology Research and Development Program of China (Grant No. 2012AA011603).
Abstract: Linear scan computed tomography (LCT) is well suited to online industrial scanning and security inspection because of its straight-line source trajectory and high scanning speed. In practical applications, however, image reconstruction is challenging because the data are limited-angle and insufficient. In this paper, a new reconstruction algorithm based on total-variation (TV) minimization is developed to reconstruct images from such data in LCT. The main idea is to reformulate the TV problem as a linear equality constrained problem with a separable objective, and then minimize its augmented Lagrangian function with the alternating direction method (ADM), solving the subproblems alternately. We establish the convergence of the ADM scheme, which makes the method robust and efficient for this reconstruction task. Numerical simulations and real-data reconstructions show that the proposed method performs well and outperforms some previous methods on an LCT imaging problem.
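As an illustration of the split-variable idea described above (introducing an auxiliary variable w = Dx for the TV term and alternating closed-form updates), here is a minimal 1D denoising sketch in Python. It is a toy instance in which the forward operator is the identity rather than an LCT projector; all names and parameter values are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of ADM for a TV-regularized problem:
#   min_x 0.5*||x - b||^2 + mu*||D x||_1,  reformulated with w = D x.
# Toy 1D denoising (forward operator = identity), not the authors' LCT code.
import numpy as np

def tv_admm_denoise(b, mu=1.0, beta=5.0, iters=300):
    n = b.size
    D = np.diff(np.eye(n), axis=0)        # forward differences, (n-1) x n
    M = np.eye(n) + beta * D.T @ D        # x-subproblem system matrix
    w = np.zeros(n - 1)                   # auxiliary variable, w ~ D x
    lam = np.zeros(n - 1)                 # Lagrange multiplier
    x = b.copy()
    for _ in range(iters):
        # x-step: quadratic subproblem, solved exactly
        x = np.linalg.solve(M, b + D.T @ (beta * w - lam))
        # w-step: soft-thresholding (proximal map of the l1 norm)
        v = D @ x + lam / beta
        w = np.sign(v) * np.maximum(np.abs(v) - mu / beta, 0.0)
        # multiplier update
        lam += beta * (D @ x - w)
    return x

# Usage: recover a piecewise-constant signal from noisy samples.
rng = np.random.default_rng(0)
truth = np.repeat([0.0, 1.0, -0.5], 50)
x_hat = tv_admm_denoise(truth + 0.1 * rng.standard_normal(truth.size), mu=0.5)
```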
Funding: Supported by the National Natural Science Foundation of China (Nos. 61303264, 61202482, and 61202488), the Guangxi Cooperative Innovation Center of Cloud Computing and Big Data (No. YD16505), and the Distinguished Young Scientist Promotion of the National University of Defense Technology.
Abstract: We consider a wide range of non-convex regularized minimization problems in which the non-convex regularizer is composed with a linear map, as is common in sparse learning. Recent theoretical investigations have demonstrated the superiority of such regularizers over their convex counterparts. The computational challenge is that the proximal mapping associated with the non-convex regularization is not easily obtained because of the linear composition. Fortunately, the problem structure allows one to introduce an auxiliary variable and reformulate the problem as an optimization with linear constraints, which can be solved using the linearized alternating direction method of multipliers (LADMM). Despite the practical success of LADMM, it has remained unknown whether it converges when solving such non-convex compositely regularized problems. In this work, we first present a detailed convergence analysis of LADMM for a non-convex compositely regularized optimization problem covering a large class of non-convex penalties. Furthermore, we propose an adaptive LADMM (AdaLADMM) algorithm with a line-search criterion. Experimental results on datasets of different genres validate the efficacy of the proposed algorithm.
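The following Python sketch illustrates the reformulation described above: an auxiliary variable y = Bx turns min f(x) + r(Bx) into a linearly constrained problem, and LADMM takes a linearized (gradient-type) x-step plus a closed-form proximal y-step. It is a generic sketch, not the paper's AdaLADMM with line search; the MCP penalty is used as one representative non-convex penalty, and all names and parameters are illustrative assumptions.

```python
import numpy as np

def prox_mcp(v, lam, gamma, beta):
    """Proximal map of (1/beta)*MCP(lam, gamma); valid when gamma*beta > 1."""
    a = np.abs(v)
    z = np.where(a <= lam / beta, 0.0,
                 np.sign(v) * (a - lam / beta) / (1.0 - 1.0 / (gamma * beta)))
    return np.where(a >= gamma * lam, v, z)   # beyond gamma*lam: identity

def ladmm(A, c, B, lam=0.1, gamma=3.0, beta=1.0, iters=500):
    """Sketch of LADMM for min 0.5||Ax-c||^2 + MCP(Bx): introduce y = Bx,
    linearize the smooth and coupling terms, then take a proximal y-step."""
    n, m = A.shape[1], B.shape[0]
    x, y, u = np.zeros(n), np.zeros(m), np.zeros(m)   # u: scaled multiplier
    tau = 0.9 / (np.linalg.norm(A, 2) ** 2 + beta * np.linalg.norm(B, 2) ** 2)
    for _ in range(iters):
        grad = A.T @ (A @ x - c) + beta * B.T @ (B @ x - y + u)
        x = x - tau * grad                         # linearized x-update
        y = prox_mcp(B @ x + u, lam, gamma, beta)  # closed-form y-update
        u = u + (B @ x - y)                        # (scaled) multiplier update
    return x

# Usage: sparse recovery with B = I (a genuine composition B also works).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)); x0 = np.zeros(100); x0[:5] = 1.0
x_hat = ladmm(A, A @ x0, np.eye(100), lam=0.1)
```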
Funding: Bing-Sheng He was supported by the National Natural Science Foundation of China (No. 11471156); Xiao-Ming Yuan was supported by the General Research Fund from the Hong Kong Research Grants Council (No. 12302514).
Abstract: Linear programming is the core problem of various operational research problems. The dominant approaches to linear programming are the simplex and interior point methods. In this paper, we show that the alternating direction method of multipliers (ADMM), which was proposed long ago and has recently found more and more applications in a broad spectrum of areas, can also be used in a simple way to solve the canonical linear programming model. The resulting per-iteration complexity is O(mn), where m is the number of constraints and n the dimension of the variables. At each iteration there are m subproblems that are eligible for parallel computation, each requiring only O(n) flops, and there is no inner iteration. We thus introduce this new ADMM approach to linear programming, which may inspire deeper research into more complicated scenarios with more sophisticated results.
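For concreteness, here is a textbook variable-splitting ADMM for the canonical LP min c^T x subject to Ax = b, x >= 0. Note that this sketch solves an m-by-m linear system per iteration rather than achieving the paper's O(mn) cost with m parallel O(n) subproblems; it illustrates the general splitting idea only, and all names and parameters are illustrative assumptions.

```python
import numpy as np

def admm_lp(c, A, b, beta=1.0, iters=2000):
    """Textbook splitting: min c^T x + I{Ax=b}(x) + I{z>=0}(z) s.t. x = z.
    The x-update is an equality-constrained QP solved via its KKT system;
    the z-update is a projection onto the nonnegative orthant."""
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)   # u: scaled multiplier
    AAt = A @ A.T                                     # factor once in practice
    for _ in range(iters):
        rhs = beta * (A @ (z - u) - b) - A @ c
        nu = np.linalg.solve(AAt, rhs)                # dual variable of Ax = b
        x = z - u - (c + A.T @ nu) / beta             # KKT solution of x-step
        z = np.maximum(x + u, 0.0)                    # project onto x >= 0
        u = u + (x - z)                               # multiplier update
    return x

# Usage on a tiny LP: min -x1 - x2  s.t.  x1 + x2 = 1,  x >= 0.
x = admm_lp(np.array([-1.0, -1.0]), np.array([[1.0, 1.0]]), np.array([1.0]))
```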
Funding: This research was partly supported by the National Natural Science Foundation of China (Grant No. 11671217) and the Natural Science Foundation of Xinjiang (Grant No. 2017D01A14).
Abstract: In recent years, the alternating direction method of multipliers (ADMM) and its variants have been widely used in image processing and statistical learning. One variant, the symmetric ADMM, which updates the Lagrange multiplier twice in each iteration, is typically faster whenever it converges. In this paper, combining Nesterov's acceleration strategy, we propose an accelerated symmetric ADMM and prove its O(1/k^2) convergence rate under a strong convexity assumption. For the general case, an accelerated method with a restart rule is proposed. Preliminary numerical experiments show the efficiency of our algorithms.
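For reference, a common form of the symmetric ADMM iteration for min f(x) + g(y) subject to Ax + By = b is sketched below, with the multiplier updated both between and after the two primal steps. The relaxation factors r and s, and the exact Nesterov-accelerated variant analyzed in the paper, are not specified here; the symbols are illustrative.

```latex
% Symmetric ADMM with penalty beta and relaxation factors r, s:
\begin{align*}
x^{k+1} &= \arg\min_x \; \mathcal{L}_\beta(x, y^k, \lambda^k), \\
\lambda^{k+\frac{1}{2}} &= \lambda^k - r\beta\,\bigl(A x^{k+1} + B y^k - b\bigr), \\
y^{k+1} &= \arg\min_y \; \mathcal{L}_\beta\bigl(x^{k+1}, y, \lambda^{k+\frac{1}{2}}\bigr), \\
\lambda^{k+1} &= \lambda^{k+\frac{1}{2}} - s\beta\,\bigl(A x^{k+1} + B y^{k+1} - b\bigr),
\end{align*}
% where L_beta is the augmented Lagrangian. Nesterov-type acceleration
% extrapolates the iterates between outer steps; a restart rule typically
% resets the extrapolation when a monitored residual increases.
```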