Journal Articles
Found 2 articles
SEMI-PROXIMAL POINT METHOD FOR NONSMOOTH CONVEX-CONCAVE MINIMAX OPTIMIZATION
1
Authors: Yuhong Dai, Jiani Wang, Liwei Zhang. Journal of Computational Mathematics (SCIE, CSCD), 2024, No. 3, pp. 617-637 (21 pages)
Minimax optimization problems are an important class of optimization problems arising in modern machine learning and traditional research areas. While many numerical algorithms exist for solving smooth convex-concave minimax problems, numerical algorithms for nonsmooth convex-concave minimax problems are rare. This paper develops an efficient numerical algorithm for a structured nonsmooth convex-concave minimax problem. A semi-proximal point method (SPP) is proposed, in which a quadratic convex-concave function approximates the smooth part of the objective function and semi-proximal terms are added to each subproblem. This construction ensures that the subproblems at each iteration are solvable, and even easily solved when the semi-proximal terms are chosen cleverly. We prove the global convergence of the algorithm under mild assumptions, without requiring a strong convexity-concavity condition. Under locally metrical subregularity of the solution mapping, we prove that the algorithm converges at a linear rate. Preliminary numerical results are reported to verify the efficiency of the algorithm.
Keywords: Minimax optimization; Convexity-concavity; Global convergence; Rate of convergence; Locally metrical subregularity
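To illustrate the family of methods this abstract builds on, here is a minimal sketch of the classical proximal point method applied to a bilinear saddle problem min_x max_y x^T A y. This is not the paper's SPP algorithm (which handles a structured nonsmooth objective with carefully chosen semi-proximal terms); the function name, the toy problem, and the parameter `c` are illustrative assumptions. Each iteration solves the regularized subproblem min_x max_y x^T A y + (c/2)||x - x_k||^2 - (c/2)||y - y_k||^2, whose optimality conditions form a linear system.

```python
import numpy as np

def proximal_point_saddle(A, x0, y0, c=1.0, iters=200):
    """Classical proximal point iterations for min_x max_y x^T A y.
    Each step solves the strongly convex-strongly concave subproblem
        min_x max_y  x^T A y + (c/2)||x - x_k||^2 - (c/2)||y - y_k||^2,
    whose first-order conditions are the linear system K z = c z_k below."""
    n, m = A.shape
    x, y = x0.astype(float), y0.astype(float)
    # KKT matrix of the regularized subproblem; the skew part plus c*I
    # makes it nonsingular, so every subproblem is uniquely solvable.
    K = np.block([[c * np.eye(n), A],
                  [-A.T, c * np.eye(m)]])
    for _ in range(iters):
        rhs = np.concatenate([c * x, c * y])
        z = np.linalg.solve(K, rhs)
        x, y = z[:n], z[n:]
    return x, y
```

When A has full rank, the unique saddle point is (0, 0), and the iterates contract linearly with factor 1/sqrt(1 + sigma_min(A)^2 / c^2), which echoes why regularized subproblems can yield linear convergence under regularity of the solution mapping.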
Augmented Lagrangian Methods for Convex Matrix Optimization Problems
2
Authors: Ying Cui, Chao Ding, Xu-Dong Li, Xin-Yuan Zhao. Journal of the Operations Research Society of China (EI, CSCD), 2022, No. 2, pp. 305-342 (38 pages)
In this paper, we provide a gentle introduction to recent advances in augmented Lagrangian methods for solving large-scale convex matrix optimization problems (cMOP). Specifically, we review two types of sufficient conditions for ensuring the quadratic growth conditions of a class of constrained convex matrix optimization problems regularized by nonsmooth spectral functions. Under a mild quadratic growth condition on the dual of cMOP, we further discuss the R-superlinear convergence of the Karush-Kuhn-Tucker (KKT) residuals of the sequence generated by the augmented Lagrangian method (ALM) for solving convex matrix optimization problems. Implementation details of the ALM for solving core convex matrix optimization problems are also provided.
Keywords: Matrix optimization; Spectral functions; Quadratic growth conditions; Metric subregularity; Augmented Lagrangian methods; Fast convergence rates; Semismooth Newton methods
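The augmented Lagrangian scheme surveyed here can be sketched on a toy equality-constrained convex problem min 0.5||x||^2 s.t. Ax = b. This is only a scalar-variable illustration of the generic ALM iteration (x-minimization of the augmented Lagrangian, then a multiplier update), not the paper's matrix-variable algorithm or its semismooth Newton subproblem solver; the function name and penalty parameter `sigma` are assumptions.

```python
import numpy as np

def alm_min_norm(A, b, sigma=5.0, iters=100):
    """Augmented Lagrangian method for  min 0.5*||x||^2  s.t.  Ax = b.
    L_sigma(x, lam) = 0.5||x||^2 + lam^T(Ax - b) + (sigma/2)||Ax - b||^2.
    The x-subproblem is an unconstrained quadratic, so it is solved
    exactly; the multiplier step is lam += sigma * (Ax - b)."""
    m, n = A.shape
    lam = np.zeros(m)
    H = np.eye(n) + sigma * A.T @ A  # Hessian of the x-subproblem
    for _ in range(iters):
        # x-step: stationarity  x + A^T lam + sigma A^T (Ax - b) = 0
        x = np.linalg.solve(H, sigma * A.T @ b - A.T @ lam)
        # dual step: gradient ascent on the dual with step sigma
        lam = lam + sigma * (A @ x - b)
    return x, lam
```

For full-row-rank A this converges to the minimum-norm solution pinv(A) @ b; since ALM is the proximal point method applied to the dual, the multipliers contract linearly with factor 1/(1 + sigma * lambda_min(A A^T)), which is the mechanism behind the fast (R-superlinear) KKT-residual rates discussed in the survey under quadratic growth of the dual.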