In this paper, we consider the convergence properties of a new class of three-term conjugate gradient methods with a generalized Armijo step size rule for minimizing a continuously differentiable function f on R^n, without assuming that the sequence {x_k} of iterates is bounded. We prove that lim inf ‖∇f(x_k)‖ = 0. Moreover, we prove that, when f(x) is a pseudo-convex (quasi-convex) function, the new method has strong convergence results: either x_k → x*, where x* is a minimizer (stationary point), or ‖x_k‖ → ∞, arg min{f(x) : x ∈ R^n} = ∅, and f(x_k) ↓ inf{f(x) : x ∈ R^n}. By combining the FR, PR, and HS methods with the new method, these methods are modified so that they possess the global convergence property. Numerical results show that the new algorithms are efficient in comparison with the FR, PR, and HS conjugate gradient methods with the Armijo step size rule.
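For illustration, the following minimal sketch shows a conjugate gradient iteration combined with an Armijo backtracking line search. The abstract does not give the paper's three-term search direction or its generalized Armijo rule, so the sketch uses the classical two-term Fletcher-Reeves (FR) update with an ordinary Armijo condition; the function names and the quadratic test problem are hypothetical examples, not the authors' method.

import numpy as np

def armijo_step(f, xk, fk, gk, dk, alpha0=1.0, rho=0.5, sigma=1e-4, max_backtracks=50):
    # Backtrack from alpha0 until f(xk + alpha*dk) <= f(xk) + sigma*alpha*gk.dk.
    alpha = alpha0
    slope = gk @ dk  # directional derivative; negative for a descent direction
    for _ in range(max_backtracks):
        if f(xk + alpha * dk) <= fk + sigma * alpha * slope:
            break
        alpha *= rho
    return alpha

def cg_fr_armijo(f, grad, x0, tol=1e-6, max_iter=1000):
    xk = np.asarray(x0, dtype=float)
    gk = grad(xk)
    dk = -gk  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(gk) < tol:
            break
        alpha = armijo_step(f, xk, f(xk), gk, dk)
        xk = xk + alpha * dk
        g_new = grad(xk)
        beta_fr = (g_new @ g_new) / (gk @ gk)  # Fletcher-Reeves coefficient
        dk = -g_new + beta_fr * dk             # two-term CG direction
        if g_new @ dk >= 0:                    # safeguard: restart if not a descent direction
            dk = -g_new
        gk = g_new
    return xk

if __name__ == "__main__":
    # Hypothetical test problem: a strictly convex quadratic on R^3.
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(cg_fr_armijo(f, grad, np.array([1.0, 1.0, 1.0])))

The three-term methods studied in the paper add a further correction term to this two-term direction; the sketch only fixes the baseline against which such modifications are compared.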
In this paper, the continuously differentiable optimization problem min{f(x) : x ∈ Ω}, where Ω ⊆ R^n is a nonempty closed convex set, is considered. The gradient projection method of Calamai and Moré (Math. Programming, Vol. 39, pp. 93-116, 1987) is modified by a memory gradient term to improve its convergence rate. The convergence of the new method is analyzed without assuming that the iteration sequence {x^k} is bounded. Moreover, it is shown that, when f(x) is a pseudo-convex (quasi-convex) function, the new method has strong convergence results. Numerical results show that the method of this paper is more effective than the gradient projection method.
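As a point of reference, the following minimal sketch implements the classical gradient projection iteration x^{k+1} = P_Ω(x^k - α_k ∇f(x^k)) with an Armijo-type backtracking rule along the projection arc. The memory gradient modification analyzed in the paper is not specified in the abstract and is therefore omitted; the box constraint Ω and the quadratic objective are hypothetical examples.

import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box {x : lo <= x <= hi}.
    return np.clip(x, lo, hi)

def gradient_projection(f, grad, x0, lo, hi, alpha0=1.0, rho=0.5,
                        sigma=1e-4, tol=1e-8, max_iter=500):
    xk = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        gk = grad(xk)
        alpha, fk = alpha0, f(xk)
        # Backtrack until an Armijo-type sufficient decrease condition holds
        # along the projection arc.
        while True:
            x_trial = project_box(xk - alpha * gk, lo, hi)
            if f(x_trial) <= fk + sigma * gk @ (x_trial - xk) or alpha < 1e-16:
                break
            alpha *= rho
        if np.linalg.norm(x_trial - xk) < tol:  # projected-gradient stationarity test
            return x_trial
        xk = x_trial
    return xk

if __name__ == "__main__":
    # Hypothetical example: minimize a convex quadratic over the box [0, 2]^2.
    Q = np.array([[3.0, 1.0], [1.0, 2.0]])
    c = np.array([-4.0, -3.0])
    f = lambda x: 0.5 * x @ Q @ x + c @ x
    grad = lambda x: Q @ x + c
    print(gradient_projection(f, grad, np.array([2.0, 2.0]), 0.0, 2.0))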
Funding: Supported by the National Natural Science Foundation of China (No. 10571106).