Journal Articles — 2 articles found
Weak and Strong Convergence of Self Adaptive Inertial Subgradient Extragradient Algorithms for Solving Variational Inequality Problems
1
Authors: Yao Li, Hongwei Liu, Jiamin Lv — Journal of Harbin Institute of Technology (New Series), CAS, 2024, No. 2, pp. 38-49 (12 pages)
Many methods have been proposed for solving variational inequalities, among which the subgradient extragradient method has clear advantages. This paper presents two algorithms for solving the variational inequality problem, defined in a real Hilbert space under L-Lipschitz continuity and pseudomonotonicity conditions. Both new algorithms adopt inertial techniques and a non-monotonic step size rule, and their convergence can still be proved when the value of L is not given in advance. Finally, numerical results demonstrate the computational efficiency of the two new algorithms.
Keywords: variational inequality; inertial method; non-monotonic step size rule; Lipschitz continuity; pseudomonotone mapping
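The abstract above describes inertial subgradient extragradient algorithms with a self-adaptive, non-monotonic step size. A minimal Python sketch of the general technique follows — not the paper's exact algorithms: the inertial factor `alpha`, the parameter `mu`, and the summable increment `1/k**2` that makes the step size non-monotonic are illustrative assumptions.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def project_halfspace(x, a, y):
    """Projection onto the half-space {u : <a, u - y> <= 0}."""
    viol = a @ (x - y)
    if viol <= 0:
        return x  # already feasible (also covers a == 0)
    return x - (viol / (a @ a)) * a

def inertial_subgradient_extragradient(F, proj_C, x0, alpha=0.3, tau=1.0,
                                       mu=0.5, n_iter=2000):
    """Inertial subgradient extragradient sketch with a self-adaptive,
    non-monotonic step size: tau shrinks when the local Lipschitz
    estimate demands it and may otherwise grow by a summable amount,
    so the Lipschitz constant L need not be known in advance."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, n_iter + 1):
        w = x + alpha * (x - x_prev)          # inertial extrapolation
        y = proj_C(w - tau * F(w))            # extragradient step onto C
        a = w - tau * F(w) - y                # normal of the half-space T_k
        z = project_halfspace(w - tau * F(y), a, y)  # cheap projection onto T_k
        # self-adaptive step size: decrease if needed, allow a summable increase
        diff_F = np.linalg.norm(F(w) - F(y))
        if diff_F > 0:
            tau = min(mu * np.linalg.norm(w - y) / diff_F, tau + 1.0 / k**2)
        x_prev, x = x, z
    return x
```

The second projection is onto a half-space containing C rather than onto C itself, which is what makes the subgradient extragradient variant cheap: only one projection onto C per iteration.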
Global Convergence of a Modified Gradient Projection Method for Convex Constrained Problems (cited by: 1)
2
Authors: Qing-ying Sun, Chang-yu Wang, Zhen-jun Shi — Acta Mathematicae Applicatae Sinica, SCIE, CSCD, 2006, No. 2, pp. 227-242 (16 pages)
In this paper, the continuously differentiable optimization problem min{f(x) : x ∈ Ω}, where Ω ⊆ R^n is a nonempty closed convex set, is considered. The gradient projection method of Calamai and Moré (Math. Programming, Vol. 39, pp. 93-116, 1987) is modified with a memory gradient to improve its convergence rate. The convergence of the new method is analyzed without assuming that the iteration sequence {x^k} is bounded. Moreover, it is shown that when f(x) is a pseudo-convex (quasi-convex) function, the new method has strong convergence results. Numerical results show that the proposed method is more effective than the gradient projection method.
Keywords: nonlinear programming; projection; generalized Armijo step size rule; convergence
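The baseline this paper modifies is gradient projection with a generalized Armijo step size rule. A minimal sketch of that baseline follows; the memory-gradient direction that is the paper's contribution is omitted, and the backtracking parameters `beta`, `sigma`, and `s` are illustrative assumptions.

```python
import numpy as np

def projected_gradient_armijo(f, grad, proj, x0, beta=0.5, sigma=1e-4,
                              s=1.0, n_iter=200):
    """Gradient projection with an Armijo-type backtracking rule along
    the projection arc (a basic Calamai–Moré-style scheme).  At each
    trial step t, the candidate proj(x - t*grad(x)) is accepted once
    it gives sufficient decrease relative to the linear model."""
    x = proj(x0)
    for _ in range(n_iter):
        t = s
        while True:
            x_new = proj(x - t * grad(x))
            # Armijo sufficient-decrease condition along the projection arc
            if f(x_new) <= f(x) + sigma * (grad(x) @ (x_new - x)):
                break
            t *= beta
            if t < 1e-12:  # safeguard against an endless backtrack
                break
        x = x_new
    return x
```

At a solution, proj(x - t*grad(x)) = x for every t > 0, so the iteration becomes stationary; this fixed-point property is the standard optimality characterization for projected gradient methods.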