Funding: Supported by the National Natural Science Foundation of China (10871178), the Natural Science Foundation of Zhejiang Province of China (Y606154), and the Foundation of the Education Department of Zhejiang Province of China (20071362).
Abstract: A new convergence theorem for the Secant method in Banach spaces, based on new recurrence relations, is established for approximating a solution of a nonlinear operator equation. It is assumed that the divided difference of order one of the nonlinear operator is Lipschitz continuous. The convergence conditions differ from some existing ones and are easily satisfied. The results of the paper are justified by numerical examples that cannot be handled by earlier works.
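For orientation, here is a minimal sketch of the classical secant iteration driven by a first-order divided difference, shown in one dimension for simplicity; the test function, starting points, and tolerance are illustrative and not taken from the paper.

```python
def secant(f, x_prev, x_curr, tol=1e-12, max_iter=50):
    """Classical secant iteration for a scalar equation f(x) = 0.
    The first-order divided difference f[x_prev, x_curr] replaces the
    derivative used by Newton's method."""
    for _ in range(max_iter):
        f_prev, f_curr = f(x_prev), f(x_curr)
        if f_curr == f_prev:                   # divided difference vanished; stop
            return x_curr
        dd = (f_curr - f_prev) / (x_curr - x_prev)   # divided difference of order one
        x_next = x_curr - f_curr / dd
        if abs(x_next - x_curr) < tol:
            return x_next
        x_prev, x_curr = x_curr, x_next
    return x_curr

# Example: solve x^3 - 2 = 0 starting from x_{-1} = 1.0 and x_0 = 2.0
print(secant(lambda x: x**3 - 2.0, 1.0, 2.0))   # ~1.259921
```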
Funding: Supported by the Qianjiang Rencai Project Foundation of Zhejiang Province (J20070288).
Abstract: In this paper, a new weak condition for the convergence of the secant method for solving systems of nonlinear equations is proposed. A convergence ball centered at x0 is replaced by one centered at x1, the first approximation generated by the secant method from the initial data x-1 and x0. Under boundedness conditions on the divided difference, a convergence theorem is obtained, and two examples are provided to illustrate the weakness of the convergence conditions. Moreover, the secant method is applied to a system of nonlinear equations to demonstrate the viability and effectiveness of the results of the paper.
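The sketch below illustrates one common way to realize the secant method for a system, with the divided-difference matrix [x, y; F] built column by column. The particular construction and the test problem are our assumptions for illustration; the paper itself only requires boundedness conditions on the divided difference.

```python
import numpy as np

def divided_difference(F, x, y):
    """First-order divided difference [x, y; F], built column by column:
    column j uses points that differ only in the first j+1 coordinates.
    Requires x[j] != y[j] for every j."""
    n = len(x)
    A = np.zeros((n, n))
    for j in range(n):
        u = np.concatenate([y[:j + 1], x[j + 1:]])
        v = np.concatenate([y[:j], x[j:]])
        A[:, j] = (F(u) - F(v)) / (y[j] - x[j])
    return A

def secant_system(F, x_m1, x_0, tol=1e-10, max_iter=50):
    """Secant iteration x_{k+1} = x_k - [x_{k-1}, x_k; F]^{-1} F(x_k)."""
    x_prev, x_curr = np.asarray(x_m1, float), np.asarray(x_0, float)
    for _ in range(max_iter):
        A = divided_difference(F, x_prev, x_curr)
        step = np.linalg.solve(A, F(x_curr))
        x_prev, x_curr = x_curr, x_curr - step
        if np.linalg.norm(step) < tol:
            break
    return x_curr

# Example system: x^2 + y^2 - 1 = 0, x - y = 0
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
print(secant_system(F, [0.8, 0.2], [1.0, 0.5]))   # ~ (0.7071, 0.7071)
```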
基金supported by National Natural Science Foundation of China(11771393,11371320,11632015)Zhejiang Natural Science Foundation(LZ14A010002,LQ18A010008)Scientific Research Fund of Zhejiang Provincial Education Department(FX2016073)
Abstract: Under the assumption that the nonlinear operator has Lipschitz continuous divided differences of the first order, we obtain an estimate of the radius of the convergence ball for the two-step secant method. Moreover, we also provide an error estimate that matches the convergence order of the two-step secant method. Finally, we give an application of the proposed theorem.
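As a scalar illustration, one common two-step secant scheme pairs a secant predictor with a secant corrector, as sketched below; the precise variant analyzed in the paper may differ in detail, and the test problem is ours.

```python
def two_step_secant(f, x_prev, x, tol=1e-12, max_iter=30):
    """One common two-step secant scheme (scalar illustration):
        y_k     = x_k - f(x_k) / f[x_{k-1}, x_k]
        x_{k+1} = y_k - f(y_k) / f[x_k, y_k]
    """
    dd = lambda a, b: (f(b) - f(a)) / (b - a)   # first-order divided difference
    for _ in range(max_iter):
        if f(x) == 0.0 or x == x_prev:          # already converged (numerically)
            return x
        y = x - f(x) / dd(x_prev, x)
        if y == x or f(y) == 0.0:
            return y
        x_new = y - f(y) / dd(x, y)
        if abs(x_new - x) < tol:
            return x_new
        x_prev, x = x, x_new
    return x

# Example: solve x^2 - 2 = 0 starting from x_{-1} = 1.0 and x_0 = 2.0
print(two_step_secant(lambda t: t**2 - 2.0, 1.0, 2.0))   # ~1.414214
```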
Abstract: A local convergence theorem and five semi-local convergence theorems for the secant method are listed in this paper. For every convergence theorem, a convergence ball is introduced in which the hypotheses of the corresponding theorem are satisfied. Since all of these convergence balls have the same center x^*, they can be viewed as homocentric balls. The convergence theorems are sorted by the sizes of the various radii of these homocentric balls, and the sorted sequence represents the degree of weakness of the conditions of the convergence theorems.
Funding: Supported by the Science and Technology Foundation of Shanghai Higher Education.
Abstract: The secant methods discussed by Fontecilla (in 1988) are considerably revised by employing a trust region multiplier strategy and introducing a nondifferentiable merit function. In this paper the secant methods are also improved by adding a dogleg-type movement which makes it possible to overcome a phenomenon similar to the Maratos effect. Furthermore, these algorithms are analyzed, and global convergence theorems as well as a local superlinear convergence rate are proved.
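As background, the sketch below shows a generic dogleg step for a trust-region subproblem. It is meant only to illustrate the kind of dogleg movement the abstract refers to, not the paper's specific step, which also involves a nondifferentiable merit function and a trust-region multiplier strategy; all data in the example are illustrative.

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Generic dogleg step for  min  g^T p + 0.5 p^T B p  s.t. ||p|| <= delta,
    with B symmetric positive definite."""
    p_newton = -np.linalg.solve(B, g)                 # full quasi-Newton step
    if np.linalg.norm(p_newton) <= delta:
        return p_newton
    p_cauchy = -(g @ g) / (g @ B @ g) * g             # unconstrained steepest-descent minimizer
    if np.linalg.norm(p_cauchy) >= delta:
        return -delta * g / np.linalg.norm(g)         # truncated gradient step
    # Otherwise move from the Cauchy point toward the Newton step until ||p|| = delta.
    d = p_newton - p_cauchy
    a, b, c = d @ d, 2 * p_cauchy @ d, p_cauchy @ p_cauchy - delta**2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p_cauchy + tau * d

# Example with a 2x2 quadratic model
g = np.array([1.0, -2.0])
B = np.array([[3.0, 0.5], [0.5, 2.0]])
print(dogleg_step(g, B, delta=0.5))
```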
基金supported by the partial supports of the National Science Foundation under Grant No.10871130the Ph.D. Foundation under Grant No.20093127110005 of Chinese Education Ministry
Abstract: This paper studies the local convergence of a family of improved secant methods for solving nonlinear equality constrained optimization problems subject to bounds on the variables. The Hessian of the Lagrangian is approximated using the DFP or the BFGS secant updates. The improved secant methods are used to generate a search direction. Combined with a suitable step size, each iterate switches to a trial step of strict interior feasibility. When the Hessian is only positive definite in an affine null subspace, it is shown that the algorithms generate sequences that converge q-linearly and two-step q-superlinearly. Furthermore, under some suitable assumptions, some sequences generated by the algorithms converge locally one-step q-superlinearly. Finally, some numerical results are presented to illustrate the effectiveness of the proposed algorithms.
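For reference, the standard BFGS secant update mentioned in the abstract is sketched below. In the paper, B approximates the Hessian of the Lagrangian; here the vectors s and y are generic step and gradient-difference vectors chosen purely for illustration.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS secant update: the new matrix satisfies B_new s = y
    and stays symmetric positive definite whenever y^T s > 0."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Example: maintain an approximation to the Hessian of f(x) = 0.5 x^T A x
A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.eye(2)
x_old, x_new = np.array([1.0, 1.0]), np.array([0.4, 0.7])
s, y = x_new - x_old, A @ x_new - A @ x_old     # step and change in gradient
B = bfgs_update(B, s, y)
print(B)
print(B @ s, y)                                 # B now satisfies the secant equation B s = y
```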
Abstract: In this paper, we present an m-time secant-like multi-projection algorithm for the sparse unconstrained minimization problem. We prove that these methods are q-superlinearly convergent to the solution for m ≥ 1. Finally, based on some numerical results, we discuss how to choose the number m so as to determine the approximating matrix properly in practical use.
Abstract: In this paper, we discuss the relationship between the sparse symmetric Broyden (SPSB) method [1, 2] and the m-time secant-like multi-projection (SMP) method [3] and prove that, as m goes to infinity, the SMP method corresponds to the SPSB method.
Abstract: In this paper, we propose and analyze a new scaled conjugate gradient method and its performance, based on the modified secant equation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method and on a new modified nonmonotone line search technique. The method incorporates the modified BFGS secant equation in an effort to include second-order information about the objective function. The new secant equation uses both gradient and function value information, and its update formula inherits the positive definiteness of the Hessian approximation for general convex functions. In order to improve the likelihood of finding a global optimal solution, we introduce a new modified nonmonotone line search technique. It is shown that, for nonsmooth convex problems, the proposed algorithm is globally convergent. Numerical results show that this new scaled conjugate gradient algorithm is promising and efficient for solving not only convex but also some large-scale nonsmooth nonconvex problems in the sense of the Dolan-Moré performance profiles.
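To make the line-search ingredient concrete, here is a classical max-type nonmonotone Armijo rule in the spirit of Grippo et al.; the paper's modified nonmonotone rule differs in detail, and all parameter values and the test problem below are illustrative.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, history, M=10, sigma=1e-4, beta=0.5, max_back=30):
    """Max-type nonmonotone Armijo backtracking: accept step length t if
    f(x + t d) <= max of the last M recorded function values + sigma * t * g^T d."""
    f_ref = max(history[-M:])          # reference value over the recent history
    gd = g @ d                         # directional derivative (should be negative)
    t = 1.0
    for _ in range(max_back):
        if f(x + t * d) <= f_ref + sigma * t * gd:
            return t
        t *= beta                      # backtrack
    return t

# Example on f(x) = ||x||^2 with the steepest-descent direction d = -grad f(x)
f = lambda x: float(x @ x)
x = np.array([2.0, -1.0])
g = 2 * x
d = -g
t = nonmonotone_armijo(f, x, d, g, history=[f(x)])
print(t, f(x + t * d))
```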
Abstract: Newton's method is used to find the roots of a system of equations f(x) = 0. It is one of the most important procedures in numerical analysis, and its applicability extends to differential equations and integral equations. Analysis of the method shows quadratic convergence under certain assumptions. For several years, researchers have improved the method by proposing modified Newton methods. A modification of Newton's method with an order of convergence of 1 + √2 was proposed by McDougall and Wotherspoon [1]. A new type of method with cubic convergence was proposed by H. H. H. Homeier [2]. In this article, we present a new modification of Newton's method based on the secant method. Analysis of convergence shows that the new method is cubically convergent. Our method requires an evaluation of the function and one of its derivatives.
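One well-known way to combine a Newton step with a secant correction and obtain cubic convergence is sketched below; it is not necessarily the scheme of the cited article, but it shows the flavor of such Newton-secant modifications. The test function and starting point are illustrative.

```python
def newton_secant(f, df, x, tol=1e-14, max_iter=25):
    """A classical third-order Newton-secant combination: a Newton predictor
    followed by a secant corrector through (x_k, f(x_k)) and (y_k, f(y_k)):
        y_k     = x_k - f(x_k) / f'(x_k)
        x_{k+1} = y_k - f(y_k) * (x_k - y_k) / (f(x_k) - f(y_k))
    Each iteration uses two function values and one derivative value."""
    for _ in range(max_iter):
        fx = f(x)
        y = x - fx / df(x)                       # Newton predictor
        fy = f(y)
        if fx == fy:                             # secant denominator vanished; y is (numerically) the root
            return y
        x_new = y - fy * (x - y) / (fx - fy)     # secant corrector
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: cube root of 2
print(newton_secant(lambda t: t**3 - 2.0, lambda t: 3 * t**2, x=1.5))   # ~1.259921
```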