Funding: Supported by the National Natural Science Foundation of China (No. 72071202) and the Key Laboratory of Mathematics and Engineering Applications, Ministry of Education.
Abstract: As a generalization of the two-term conjugate gradient method (CGM), the spectral CGM is one of the effective methods for solving unconstrained optimization problems. In this paper, we enhance the JJSL conjugate parameter, initially proposed by Jiang et al. (Computational and Applied Mathematics, 2021, 40:174), through a convex combination technique. This improvement allows for an adaptive search direction by integrating a newly constructed spectral gradient-type restart strategy. We then develop a new spectral CGM that employs an inexact line search to determine the step size. Under the weak Wolfe line search, we establish the sufficient descent property of the proposed search direction. Moreover, under general assumptions, and with the strong Wolfe line search used for the step size calculation, we prove the global convergence of the new algorithm. Finally, numerical results on unconstrained optimization test problems show that the new algorithm is effective.
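For illustration, the sketch below shows the general shape of a spectral CGM iteration in Python. The Fletcher-Reeves parameter, the fixed spectral parameter theta = 1, and SciPy's strong Wolfe line_search are stand-ins chosen for this sketch; the paper's JJSL-based convex combination, spectral restart strategy, and weak/strong Wolfe details are not reproduced here.

```python
# Minimal sketch of a spectral conjugate gradient iteration (not the paper's method).
import numpy as np
from scipy.optimize import line_search

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                         # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]   # strong Wolfe step (SciPy)
        if alpha is None:                          # line search failed: restart along -g
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)         # Fletcher-Reeves parameter (placeholder)
        theta = 1.0                                # spectral parameter; the paper derives its own
        d = -theta * g_new + beta * d              # spectral CG direction
        x, g = x_new, g_new
    return x

# Example: minimize a simple quadratic f(x) = ||x||^2.
x_star = spectral_cg(lambda x: x @ x, lambda x: 2 * x, np.array([3.0, -4.0]))
```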
Abstract: In this paper, the non-quasi-Newton family with inexact line search, applied to unconstrained optimization problems, is studied. A new update formula for the non-quasi-Newton family is proposed. It is proved that the resulting algorithm, with either a Wolfe-type or an Armijo-type line search, converges globally and Q-superlinearly if the function to be minimized has a Lipschitz continuous gradient.
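As a rough illustration of a quasi-Newton-type iteration with a Wolfe line search, the sketch below uses the standard BFGS inverse update and SciPy's line_search as stand-ins; the paper's new non-quasi-Newton update formula is not reproduced here.

```python
# Minimal sketch of a quasi-Newton iteration with a Wolfe line search
# (standard BFGS update as a stand-in for the paper's non-quasi-Newton formula).
import numpy as np
from scipy.optimize import line_search

def quasi_newton(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    n, g = x.size, grad(x)
    H = np.eye(n)                                  # inverse Hessian approximation
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                                 # quasi-Newton search direction
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                          # Wolfe search failed: small safeguarded step
            alpha = 1e-4
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y.dot(s) > 1e-12:                       # curvature condition (holds under Wolfe)
            rho = 1.0 / y.dot(s)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s) # standard BFGS inverse update
        x, g = x_new, g_new
    return x
```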
Abstract: This paper studies the convergence properties of multiplicative iterative algorithms with inexact line search. We prove that convergence can be guaranteed for a general form of line search rule, under either the assumption that the objective function is convex or the assumption that the sequence generated by the algorithm converges. This answers an open problem posed by Iusem.
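For illustration, the sketch below pairs one common multiplicative update (an exponentiated-gradient step over the positive orthant) with an Armijo-type backtracking rule; the paper's analysis covers a more general class of multiplicative algorithms and line search rules, so this is only an assumed concrete instance.

```python
# Minimal sketch of a multiplicative iterative update with an inexact (Armijo-type)
# line search, assuming minimization over strictly positive variables.
import numpy as np

def multiplicative_descent(f, grad, x0, c1=1e-4, shrink=0.5, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)               # requires x0 > 0 componentwise
    for _ in range(max_iter):
        g = grad(x)
        d = -x * g                                # scaled (multiplicative) descent direction
        if np.linalg.norm(d) < tol:
            break
        alpha, fx = 1.0, f(x)
        # Armijo-type backtracking: shrink alpha until sufficient decrease holds
        while f(x * np.exp(-alpha * g)) > fx + c1 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= shrink
        x = x * np.exp(-alpha * g)                # multiplicative update keeps the iterates positive
    return x
```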