Funding: the National Natural Science Foundation of China (Nos. 11671116, 11701137, 12071108, 11991020, 11991021 and 12021001), the Major Research Plan of the NSFC (No. 91630202), the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDA27000000), and the Natural Science Foundation of Hebei Province (No. A2021202010).
Abstract: Many machine learning problems can be formulated as minimizing the sum of a smooth function and a non-smooth regularization term. Proximal stochastic gradient methods are popular for solving such composite optimization problems. We propose a mini-batch proximal stochastic recursive gradient algorithm, SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB) stepsize strategy to capture the local geometry of the problem. The linear convergence and complexity of SRG-DBB are analyzed for strongly convex functions. We further establish the linear convergence of SRG-DBB under a non-strong convexity condition. Moreover, SRG-DBB is proved to converge sublinearly in the general convex case. Numerical experiments on standard data sets indicate that SRG-DBB performs better than, or comparably to, the proximal stochastic recursive gradient algorithm with best-tuned scalar stepsizes or BB stepsizes. Furthermore, SRG-DBB is superior to some advanced mini-batch proximal stochastic gradient methods.
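For concreteness, the following is a minimal Python sketch of one proximal step with a diagonal BB stepsize in the spirit described above. The elementwise BB ratio and its safeguards are illustrative assumptions rather than the paper's exact DBB rule, and the recursive gradient estimate `v` is assumed to be produced elsewhere (e.g., by a SARAH-style estimator).

```python
import numpy as np

def prox_l1(x, step, lam):
    """Soft-thresholding: proximal operator of lam * ||x||_1,
    applied elementwise with a diagonal stepsize vector `step`."""
    return np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)

def dbb_stepsize(s, y, lo=1e-8, hi=1e8, eps=1e-12):
    """Componentwise (diagonal) BB-like stepsize d_i = s_i^2 / (s_i * y_i),
    safeguarded into [lo, hi]. Illustrative; the paper's DBB rule may differ."""
    d = (s * s) / np.maximum(s * y, eps)
    return np.clip(d, lo, hi)

def srg_dbb_step(x, x_prev, v, v_prev, lam):
    """One proximal step x+ = prox(x - D v) with diagonal BB stepsize D,
    where v is a recursive (variance-reduced) stochastic gradient estimate."""
    s = x - x_prev              # iterate difference
    y = v - v_prev              # gradient-estimate difference
    d = dbb_stepsize(s, y)      # diagonal stepsize vector
    return prox_l1(x - d * v, d, lam)

# Toy usage with random data (illustrative only):
rng = np.random.default_rng(0)
x_prev, x = rng.standard_normal(5), rng.standard_normal(5)
v_prev, v = rng.standard_normal(5), rng.standard_normal(5)
print(srg_dbb_step(x, x_prev, v, v_prev, lam=0.1))
```

Because the stepsize is a vector rather than a scalar, each coordinate is scaled by its own curvature estimate, which is how a diagonal BB rule can adapt to the local geometry of ill-conditioned problems.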
Funding: supported by the NSFC (Grant Nos. 12171148, 11771138, 12101252, 11971007 and 11901185), the National Key R&D Program of China (Grant No. 2021YFA1001300), and the Fundamental Research Funds for the Central Universities (Grant No. 531118010207).
Abstract: In this paper, by designing a normalized nonmonotone search strategy with a Barzilai-Borwein-type step-size, a novel local minimax method (LMM), which is a globally convergent iterative method, is proposed and analyzed to find multiple (unstable) saddle points of nonconvex functionals in Hilbert spaces. Compared to traditional LMMs with monotone search strategies, this approach, which does not require a strict decrease of the objective functional value at each iterative step, is observed to converge faster with less computation. Firstly, based on a normalized iterative scheme coupled with a local peak selection that pulls the iterative point back onto the solution submanifold, a normalized ZH-type nonmonotone step-size search strategy is introduced by generalizing the Zhang-Hager (ZH) search strategy from optimization theory to the LMM framework, and a novel nonmonotone LMM is then constructed. Its feasibility and global convergence are rigorously established under a relaxation of the monotonicity of the functional along the iterative sequence. Secondly, to speed up the convergence of the nonmonotone LMM, a globally convergent Barzilai-Borwein-type LMM (GBBLMM) is presented by explicitly taking the Barzilai-Borwein-type step-size as the trial step-size of the normalized ZH-type nonmonotone search strategy in each iteration. Finally, the GBBLMM algorithm is implemented to find multiple unstable solutions of two classes of semilinear elliptic boundary value problems with variational structures: one is semilinear elliptic equations with the homogeneous Dirichlet boundary condition, and the other is linear elliptic equations with semilinear Neumann boundary conditions. Extensive numerical results indicate that our approach is very effective and speeds up LMMs significantly.
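As a schematic sketch of the two standard ingredients named above (the paper's LMM-specific acceptance test and peak selection are omitted), the classical ZH nonmonotone reference value $C_k$ and the Barzilai-Borwein trial step-sizes, for a functional $E$ with gradient $g_k = \nabla E(x_k)$, take the form

```latex
% Zhang-Hager nonmonotone reference value (standard form),
% with Q_0 = 1, C_0 = E(x_0), and parameter eta in [0,1):
\begin{aligned}
Q_{k+1} &= \eta\, Q_k + 1, &
C_{k+1} &= \frac{\eta\, Q_k C_k + E(x_{k+1})}{Q_{k+1}}.
\end{aligned}

% Classical Barzilai-Borwein trial step-sizes, with
% s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}:
\alpha_k^{\mathrm{BB1}}
  = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{\mathrm{BB2}}
  = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}}.
```

In a ZH-type search, a trial step (here the BB value) is accepted when the new functional value falls sufficiently below the averaged reference $C_k$ rather than below $E(x_k)$ itself, which is what permits nonmonotone behavior while retaining global convergence.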