Abstract: Many methods have been proposed for solving variational inequalities, among which the subgradient extragradient method has clear advantages. This paper presents two new algorithms for solving the variational inequality problem in a real Hilbert space, where the underlying operator is L-Lipschitz continuous and pseudomonotone. Both algorithms combine an inertial step with a non-monotonic step-size rule, and their convergence can be proved even when the Lipschitz constant L is not known in advance. Finally, numerical experiments demonstrate the computational efficiency of the two new algorithms.
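To illustrate the general structure of an inertial subgradient extragradient scheme with an adaptive step size, the following is a minimal sketch in finite dimensions. The abstract does not give the algorithms' details, so the inertial weight theta, the factor mu, the summable perturbation sequence, and the test operator here are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def proj_ball(x, r=2.0):
    """Euclidean projection onto the ball of radius r (an example feasible set C)."""
    n = np.linalg.norm(x)
    return x if n <= r else x * (r / n)

def inertial_subgrad_extragrad(F, proj_C, x0, theta=0.2, mu=0.5,
                               lam0=1.0, iters=500):
    """Sketch: inertial subgradient extragradient method for VI(F, C).

    The step size lam is updated without knowing the Lipschitz constant L:
    lam_{k+1} = min(mu * ||w-y|| / ||F(w)-F(y)||, lam_k + p_k),
    where p_k = 1/(k+1)^2 is a summable sequence, so lam may occasionally
    increase (a non-monotonic rule).
    """
    x_prev, x = x0.copy(), x0.copy()
    lam = lam0
    for k in range(iters):
        w = x + theta * (x - x_prev)        # inertial extrapolation
        Fw = F(w)
        y = proj_C(w - lam * Fw)            # extragradient trial point
        a = w - lam * Fw - y                # normal vector of half-space T_k
        Fy = F(y)
        z = w - lam * Fy
        # Project z onto T_k = {u : <a, u - y> <= 0}, a cheap half-space
        # projection replacing the second projection onto C.
        aa = a @ a
        x_next = z - max(0.0, a @ (z - y)) / aa * a if aa > 1e-16 else z
        # Non-monotonic step-size update; no prior knowledge of L is needed.
        dFF = np.linalg.norm(Fw - Fy)
        p_k = 1.0 / (k + 1) ** 2
        lam = min(mu * np.linalg.norm(w - y) / dFF, lam + p_k) if dFF > 1e-16 else lam + p_k
        x_prev, x = x, x_next
    return x
```

For a monotone linear operator F(x) = Ax with 0 in C, the unique solution of the variational inequality is the origin, which the iterates approach without L being supplied.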
Funding: Supported by the National Natural Science Foundation of China (No. 10571106).
Abstract: This paper considers the continuously differentiable optimization problem min{f(x) : x ∈ Ω}, where Ω ⊆ R^n is a nonempty closed convex set. The gradient projection method of Calamai and Moré (Math. Programming, Vol. 39, pp. 93-116, 1987) is modified with a memory gradient direction to improve its convergence rate. The convergence of the new method is analyzed without assuming that the iteration sequence {x^k} is bounded. Moreover, it is shown that when f(x) is a pseudo-convex (quasi-convex) function, the new method has strong convergence results. Numerical results show that the method in this paper is more effective than the gradient projection method.
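A memory gradient direction augments the negative gradient with a multiple of the previous search direction. The following sketch combines such a direction with projection onto Ω and an Armijo backtracking line search; the memory weight beta, the descent safeguard, and the line-search constants are illustrative assumptions, since the abstract does not specify the paper's exact rules.

```python
import numpy as np

def memory_gradient_projection(f, grad, proj, x0, beta=0.4,
                               sigma=1e-4, shrink=0.5, iters=100):
    """Sketch: gradient projection with a memory gradient direction.

    Direction: d_k = -g_k + beta * d_{k-1}, safeguarded to remain a
    descent direction; the trial point is projected onto the feasible
    set Omega before the Armijo sufficient-decrease test.
    """
    x = x0.copy()
    d = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        d = -g + beta * d                   # memory gradient direction
        if g @ d > -1e-12 * (g @ g):        # safeguard: fall back to steepest descent
            d = -g
        alpha, fx = 1.0, f(x)
        while True:                         # Armijo backtracking on the projected point
            x_new = proj(x + alpha * d)
            if f(x_new) <= fx + sigma * (g @ (x_new - x)) or alpha < 1e-12:
                break
            alpha *= shrink
        x = x_new
    return x
```

On the model problem of minimizing the distance to a point c outside the unit ball, the method settles at the boundary point c/||c||, as the gradient projection method would.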