Abstract: In this paper, the Schwarz alternating method for a fourth-order elliptic variational inequality problem is studied via an equivalent formulation, and geometric convergence on two subdomains is established.
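To illustrate the alternating idea behind the Schwarz method, the sketch below applies the classical alternating Schwarz iteration to a simpler second-order model problem (-u'' = f on (0,1) with homogeneous Dirichlet data) rather than the fourth-order variational inequality treated in the paper; the function names, grid sizes, and overlap parameter are all illustrative choices, not taken from the paper.

```python
import numpy as np

def solve_subdomain(f, a_val, b_val, n, h):
    # Solve -u'' = f on a subinterval with Dirichlet data a_val (left)
    # and b_val (right), using the standard second-order difference scheme.
    A = np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
    rhs = h**2 * f.copy()
    rhs[0] += a_val
    rhs[-1] += b_val
    return np.linalg.solve(A, rhs)

def schwarz_alternating(f_func, n=99, overlap=20, iters=50):
    # Classical alternating Schwarz for -u'' = f on (0,1), u(0) = u(1) = 0,
    # on two overlapping subdomains: each sweep solves on one subdomain
    # using the other's latest values as interior boundary data.
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)          # interior grid points
    f = f_func(x)
    u = np.zeros(n)
    mid = n // 2
    left = slice(0, mid + overlap)        # indices of subdomain 1
    right = slice(mid - overlap, n)       # indices of subdomain 2
    for _ in range(iters):
        # Subdomain 1: right boundary value taken from the current iterate.
        b_val = u[mid + overlap] if mid + overlap < n else 0.0
        u[left] = solve_subdomain(f[left], 0.0, b_val, mid + overlap, h)
        # Subdomain 2: left boundary value taken from the updated iterate.
        a_val = u[mid - overlap - 1]
        u[right] = solve_subdomain(f[right], a_val, 0.0, n - (mid - overlap), h)
    return x, u
```

With a generous overlap the sweeps converge geometrically, which is the discrete analogue of the geometric convergence the paper proves in its (fourth-order, variational-inequality) setting.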
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11971149 and 11871381) and the Natural Science Foundation of Henan Province for Youth (Grant No. 202300410146).
Abstract: The task of dividing corrupted data into their respective subspaces can be well illustrated, both theoretically and numerically, by recovering the low-rank and sparse-column components of a given matrix. Generally, it can be characterized as a convex minimization problem involving matrix variables and the ℓ2,1-norm. However, solving the resulting problem is challenging because of the non-smoothness of the objective function. One of the earliest solvers is a 3-block alternating direction method of multipliers (ADMM), which updates each variable in a Gauss–Seidel manner. In this paper, we present three variants of ADMM for the 3-block separable minimization problem. More precisely, once one variable is fixed, the resulting problem can be regarded as a convex minimization with 2 blocks and can be solved immediately by the standard ADMM. If the inner iteration loops only once, the iterative scheme reduces to the ADMM with Gauss–Seidel updates. If the solution of the inner iteration is assumed to be exact, convergence follows easily from the literature. Performance comparisons with a couple of recently designed solvers illustrate that the proposed methods are effective and competitive.
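The inner solver described here is the standard 2-block ADMM, in which each subproblem has a closed-form solution. A minimal sketch on a toy problem (min ||x||_1 + (μ/2)||y - b||², subject to x = y; the problem, parameter names, and penalty value are illustrative, not the paper's model):

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_two_block(b, mu=1.0, rho=1.0, iters=500):
    # Standard 2-block ADMM for
    #   min_{x,y} ||x||_1 + (mu/2)||y - b||^2   s.t.  x = y,
    # alternating two closed-form subproblems and a multiplier step.
    x = np.zeros_like(b)
    y = np.zeros_like(b)
    u = np.zeros_like(b)                            # scaled dual variable
    for _ in range(iters):
        x = soft_threshold(y - u, 1.0 / rho)        # x-subproblem (prox of l1)
        y = (mu * b + rho * (x + u)) / (mu + rho)   # y-subproblem (quadratic)
        u = u + x - y                               # multiplier update
    return x
```

Fixing one of three blocks and running such a 2-block ADMM as an inner loop is the structure of the variants described in the abstract; looping the inner iteration only once recovers the Gauss–Seidel 3-block scheme.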
Funding: Supported by the National Natural Science Foundation of China (No. 12171106) and the Natural Science Foundation of Guangxi Province (No. 2020GXNSFDA238017).
Abstract: The alternating direction method of multipliers (ADMM) is one of the most successful and powerful methods for separable minimization. Building on the idea of the symmetric ADMM for two-block optimization, we add an updating formula for the Lagrange multiplier, without restricting its position, in the multi-block setting. Combining this with the Bregman distance, we present a Bregman-style partially symmetric ADMM for nonconvex multi-block optimization with linear constraints, in which the Lagrange multiplier is updated twice with different relaxation factors in each iteration. Under suitable conditions, the global convergence, strong convergence, and convergence rate of the presented method are established. Finally, some preliminary numerical results are reported to support the theoretical assertions; these show that the presented method is numerically effective.
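For orientation, the two-block symmetric ADMM that motivates this partially symmetric scheme updates the multiplier twice per iteration, with relaxation factors r and s; a standard sketch of that scheme (notation assumed here, not taken from the paper, and without the Bregman and multi-block ingredients) is:

```latex
% Symmetric ADMM for  min f(x) + g(y)  s.t.  Ax + By = b,  with the
% augmented Lagrangian
%   L_beta(x,y,lambda) = f(x) + g(y) - <lambda, Ax+By-b> + (beta/2)||Ax+By-b||^2.
\begin{aligned}
x^{k+1} &= \arg\min_x \, \mathcal{L}_\beta\bigl(x, y^k, \lambda^k\bigr), \\
\lambda^{k+\frac12} &= \lambda^k - r\beta\,\bigl(A x^{k+1} + B y^k - b\bigr), \\
y^{k+1} &= \arg\min_y \, \mathcal{L}_\beta\bigl(x^{k+1}, y, \lambda^{k+\frac12}\bigr), \\
\lambda^{k+1} &= \lambda^{k+\frac12} - s\beta\,\bigl(A x^{k+1} + B y^{k+1} - b\bigr).
\end{aligned}
```

Taking r = 0 recovers the classical ADMM with a single multiplier update; the paper's contribution is to carry this twice-updated multiplier idea into the nonconvex multi-block setting with Bregman distances.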
Funding: The first author was supported by the NSFC grant 10271054; the third author was supported in part by the Hong Kong Research Grants Council through an RGC-CERG grant (HKUST6203/99E).
Abstract: The alternating directions method is one approach for solving linearly constrained separable monotone variational inequalities. Experience with applications has shown that the number of iterations depends significantly on the penalty for the system of linearly constrained equations, so a method with variable penalties is advantageous in practice. In this paper, we extend the Kontogiorgis and Meyer method [12] by removing the monotonicity assumption on the variable penalty matrices. Moreover, we introduce a self-adaptive rule that makes the method more efficient and insensitive to the initial penalty. Numerical results for a class of Fermat–Weber problems show that the modified method and its self-adaptive technique are proper and necessary in practice.
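The self-adaptive rule proposed in the paper is its own contribution; as a generic illustration of penalty adaptation, the residual-balancing heuristic common in the ADMM literature enlarges the penalty when the primal residual dominates and shrinks it when the dual residual dominates (the function name, thresholds, and scaling factor below are illustrative defaults, not the paper's rule):

```python
def update_penalty(rho, r_primal, s_dual, mu=10.0, tau=2.0):
    # Residual-balancing self-adaptive penalty rule: keep the primal and
    # dual residual norms within a factor mu of each other by rescaling
    # the penalty parameter rho by tau.
    if r_primal > mu * s_dual:
        return rho * tau      # primal residual too large: tighten the penalty
    if s_dual > mu * r_primal:
        return rho / tau      # dual residual too large: relax the penalty
    return rho                # residuals balanced: leave rho unchanged
```

Rules of this kind are what make the iteration count far less sensitive to the initial penalty, which is the practical point the abstract emphasizes.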
Funding: Supported by the National Natural Science Foundation of China (Nos. 12171106 and 72071202), the Natural Science Foundation of Guangxi Province (No. 2020GXNSFDA238017), and the Key Laboratory of Mathematics and Engineering Applications, Ministry of Education.
Abstract: This work explores a family of two-block nonconvex optimization problems subject to linear constraints. We first introduce a simple but universal Bregman-style improved alternating direction method of multipliers (ADMM) based on the iteration framework of ADMM and the Bregman distance. Then, exploiting the smoothness of one of the component functions, we develop a linearized version of it. Compared to the traditional ADMM, both proposed methods integrate a convex combination strategy into the multiplier update step. For each proposed method, we demonstrate convergence of the entire iteration sequence to a unique critical point of the augmented Lagrangian function using the powerful Kurdyka–Łojasiewicz property, and we derive convergence rates for both the sequence of merit-function values and the iteration sequence. Finally, numerical results show that the proposed methods are effective and encouraging for the Lasso model.
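Since the Lasso is the test model named above, a minimal sketch of the plain (convex, non-Bregman) ADMM baseline for it may help fix ideas; the splitting, parameter names, and defaults below are standard textbook choices, not the paper's improved or linearized variants.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_admm(A, b, lam, rho=1.0, iters=300):
    # Standard ADMM for the Lasso:
    #   min_x (1/2)||Ax - b||^2 + lam * ||z||_1   s.t.  x = z.
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    Q = A.T @ A + rho * np.eye(n)   # system matrix shared by every x-update
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(Q, Atb + rho * (z - u))   # x-subproblem (quadratic)
        z = soft_threshold(x + u, lam / rho)          # z-subproblem (prox of l1)
        u = u + x - z                                 # scaled multiplier update
    return z
```

The paper's methods modify this template in two places: Bregman proximal terms in the subproblems and a convex combination in the multiplier update, which is what enables the Kurdyka–Łojasiewicz convergence analysis in the nonconvex setting.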