Funding: Supported by the Natural Science Foundation of China (Grant Nos. 11991021, 11991020, 12021001, 11971372, 11971089, 11731013), the Strategic Priority Research Program of Chinese Academy of Sciences (Grant No. XDA27000000), and the National Key R&D Program of China (Grant Nos. 2021YFA1000300, 2021YFA1000301).
Abstract: Minimax optimization problems are an important class of optimization problems arising from modern machine learning as well as traditional research areas. While many numerical algorithms exist for solving smooth convex-concave minimax problems, numerical algorithms for nonsmooth convex-concave minimax problems are rare. This paper develops an efficient numerical algorithm for a structured nonsmooth convex-concave minimax problem. A semi-proximal point method (SPP) is proposed, in which a quadratic convex-concave function approximates the smooth part of the objective function and semi-proximal terms are added to each subproblem. This construction ensures that the subproblem at each iteration is solvable, and even easily solved when the semi-proximal terms are chosen appropriately. We prove the global convergence of the algorithm under mild assumptions, without requiring a strong convexity-concavity condition. Under local metric subregularity of the solution mapping, we further prove that the algorithm converges at a linear rate. Preliminary numerical results are reported to verify the efficiency of the algorithm.
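To make the construction concrete, the following is a generic semi-proximal point subproblem for a structured minimax problem; the notation is illustrative and not taken from the paper. For
\[
\min_{x}\max_{y}\;\bigl\{\, f(x,y) + g(x) - h(y) \,\bigr\},
\]
with $f$ smooth convex-concave and $g$, $h$ closed proper convex, the $k$-th subproblem replaces $f$ by a quadratic convex-concave model $\hat f_k$ built around the current iterate $(x^k, y^k)$ and adds semi-proximal terms defined by self-adjoint positive semidefinite operators $\mathcal{S}$ and $\mathcal{T}$:
\[
(x^{k+1}, y^{k+1}) \in \arg\min_{x}\max_{y}\;
\hat f_k(x,y) + g(x) - h(y)
+ \tfrac{1}{2}\|x - x^k\|_{\mathcal{S}}^{2}
- \tfrac{1}{2}\|y - y^k\|_{\mathcal{T}}^{2}.
\]
Choosing $\mathcal{S}$ and $\mathcal{T}$ so that the quadratic model plus the proximal terms decouples or simplifies is what makes each subproblem cheap to solve, which is the role the "cleverly chosen" semi-proximal terms play in the abstract.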
Funding: Chao Ding's research was supported by the National Natural Science Foundation of China (Nos. 11671387, 11531014, and 11688101) and the Beijing Natural Science Foundation (No. Z190002). Xu-Dong Li's research was supported by the National Key R&D Program of China (No. 2020YFA0711900), the National Natural Science Foundation of China (No. 11901107), the Young Elite Scientists Sponsorship Program by CAST (No. 2019QNRC001), the Shanghai Sailing Program (No. 19YF1402600), and the Science and Technology Commission of Shanghai Municipality Project (No. 19511120700). Xin-Yuan Zhao's research was supported by the National Natural Science Foundation of China (No. 11871002) and the General Program of Science and Technology of Beijing Municipal Education Commission (No. KM201810005004).
Abstract: In this paper, we provide a gentle introduction to recent advances in augmented Lagrangian methods for solving large-scale convex matrix optimization problems (cMOP). Specifically, we review two types of sufficient conditions for ensuring the quadratic growth conditions of a class of constrained convex matrix optimization problems regularized by nonsmooth spectral functions. Under a mild quadratic growth condition on the dual of cMOP, we further discuss the R-superlinear convergence of the Karush-Kuhn-Tucker (KKT) residuals of the sequence generated by the augmented Lagrangian method (ALM) for solving convex matrix optimization problems. Implementation details of the ALM for solving core convex matrix optimization problems are also provided.
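As a schematic reminder of the method referred to above (a generic form with illustrative notation, not the specific setup of the paper), consider a convex problem $\min\{\, f(X) : \mathcal{A}(X) = b,\; X \in \mathcal{K} \,\}$ with $\mathcal{K}$ a closed convex cone. With penalty parameter $\sigma_k > 0$, the augmented Lagrangian is
\[
L_{\sigma}(X;\lambda) = f(X) + \langle \lambda,\, \mathcal{A}(X) - b\rangle + \tfrac{\sigma}{2}\,\|\mathcal{A}(X) - b\|^{2},
\]
and the ALM iterates
\[
X^{k+1} \approx \arg\min_{X \in \mathcal{K}}\, L_{\sigma_k}(X;\lambda^{k}),\qquad
\lambda^{k+1} = \lambda^{k} + \sigma_k\bigl(\mathcal{A}(X^{k+1}) - b\bigr),
\]
where each subproblem is solved inexactly under a suitable stopping criterion. The KKT residual of $(X^{k+1}, \lambda^{k+1})$ is the quantity whose R-superlinear decay is discussed in the abstract, typically established under a quadratic growth condition on the dual problem.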