Abstract: The degree of numerical linear independence is proposed and discussed. Based on this linear independence theory, a modified limited memory BFGS method is developed. Like the standard limited memory method, the new method determines the update by applying the updating formula m times to an initial positive diagonal matrix, using m previous pairs of iterate and gradient changes. Besides the most recent pair, which guarantees quadratic termination, the choice of the other (m-1) pairs in the new method depends on the degree of numerical linear independence of previous search directions. In addition, the numerical linear independence theory is further discussed and the computation of the degree of linear independence is simplified. Theoretical and numerical results show that the new modified method efficiently improves on the standard limited memory method.
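The standard limited memory update this abstract builds on is usually implemented with the two-loop recursion. The sketch below (the function name and Python framing are ours, not the paper's) shows the textbook version; the paper's modification concerns which (s, y) pairs enter the stored lists, not the recursion itself.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list, gamma):
    """Two-loop recursion: returns -H_k @ grad, where H_k results from
    applying the BFGS update once per stored pair (s_i, y_i) to the
    initial diagonal matrix gamma * I. Pairs are ordered oldest first."""
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # backward pass: most recent pair first
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    r = gamma * q                      # apply initial matrix H_0 = gamma * I
    # forward pass: oldest pair first
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * y.dot(r)
        r += (a - b) * s
    return -r                          # search direction
```

With empty lists this reduces to steepest descent scaled by gamma, which matches the "initial positive diagonal matrix" role described above.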
Abstract: This paper presents a new class of quasi-Newton methods for solving unconstrained minimization problems. The methods can be regarded as a generalization of the Huang class of quasi-Newton methods. We prove that the directions and iterates generated by methods of the new class depend only on the parameter p if exact line searches are made at each step.
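To illustrate what a parameterized quasi-Newton class looks like, here is a minimal sketch of the well-known one-parameter Broyden family of inverse-Hessian updates. This is only a related special case given for orientation; the Huang class and the paper's new class are broader, and their formulas are not reproduced in the abstract.

```python
import numpy as np

def broyden_family_update(H, s, y, phi):
    """One step of the one-parameter Broyden family of quasi-Newton
    updates to an inverse-Hessian approximation H, using the pair
    s = x_{k+1} - x_k, y = g_{k+1} - g_k. phi = 1 gives BFGS,
    phi = 0 gives DFP; every member satisfies the secant equation
    H_new @ y == s."""
    Hy = H @ y
    sy = s.dot(y)        # curvature; assumed positive
    yHy = y.dot(Hy)
    w = s / sy - Hy / yHy
    return (H - np.outer(Hy, Hy) / yHy
              + np.outer(s, s) / sy
              + phi * yHy * np.outer(w, w))
```

The secant equation holds for every value of phi, which is the family-wide invariant that quasi-Newton class results of this kind rely on.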
Funding: the National Natural Science Foundation of China (Grant Nos. 19801033 and 10171104).
Abstract: Conjugate gradient methods are very important for unconstrained optimization, especially for large scale problems. In this paper, we propose a new conjugate gradient method in which a nonmonotone line search technique is used. Under mild assumptions, we prove the global convergence of the method. Some numerical results are also presented.
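The nonmonotone line search technique mentioned here is commonly realized as a Grippo-Lampariello-Lucidi style backtracking test: the Armijo condition is checked against the maximum of the last few function values instead of the current one, so occasional increases in f are tolerated. A minimal sketch, with illustrative function names and parameter values of our choosing:

```python
import numpy as np

def nonmonotone_backtracking(f, x, d, g, f_history,
                             delta=1e-4, tau=0.5, max_iter=50):
    """Backtracking line search with a nonmonotone acceptance test:
    accept step alpha when
        f(x + alpha * d) <= max(recent f values) + delta * alpha * g.d,
    where f_history holds the last M function values and d is a descent
    direction (g.dot(d) < 0). delta and tau are illustrative choices."""
    f_ref = max(f_history)      # reference: worst of the recent iterates
    gTd = g.dot(d)              # directional derivative, assumed negative
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + delta * alpha * gTd:
            return alpha
        alpha *= tau            # shrink the step and retry
    return alpha
```

Setting f_history to the single value f(x) recovers the ordinary monotone Armijo backtracking search.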
Funding: This work is supported by the National Natural Science Foundation of China under Grant Nos. 10571106 and 10471159.
Abstract: For unconstrained optimization, a new hybrid projection algorithm is presented in this paper. The algorithm has some attractive convergence properties. Convergence theory can be obtained under the condition that ∇f(x) is uniformly continuous. If f is continuously differentiable and pseudo-convex, the whole sequence of iterates converges to a solution of the problem without any other assumptions. Furthermore, under appropriate conditions, we show that the sequence of iterates has a cluster point if and only if Ω* ≠ ∅. Numerical examples are given at the end of the paper.
Funding: the National Natural Science Foundation of China (No. 60472071) and the Science Foundation of Beijing Municipal Commission of Education (No. KM200710028001).
Abstract: In this paper, a new class of memoryless non-quasi-Newton methods for solving unconstrained optimization problems is proposed, and the global convergence of this method under inexact line search is proved. Furthermore, we propose a hybrid method that mixes the memoryless non-quasi-Newton method with the memoryless Perry-Shanno quasi-Newton method. The global convergence of this hybrid memoryless method is proved under mild assumptions. Initial results show that these new methods are efficient on the given test problems. In particular, the memoryless non-quasi-Newton method requires little storage and computation, so it can efficiently solve large scale optimization problems.
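A memoryless method applies its update to the identity matrix using only the most recent pair (s, y), so the search direction can be formed from a handful of inner products with no stored matrix; that is the source of the low storage cost highlighted above. Below is a sketch of the standard memoryless BFGS direction, given for illustration only; the paper's non-quasi-Newton and Perry-Shanno variants use different update formulas not reproduced in the abstract.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Search direction d = -H @ g, where H is the BFGS update of the
    identity matrix using only the latest pair (s, y). H is never
    formed: only inner products and vector combinations are used,
    so storage is O(n) rather than O(n^2)."""
    sy = s.dot(y)        # curvature; assumed positive
    sg = s.dot(g)
    yg = y.dot(g)
    return (-g + (yg * s + sg * y) / sy
              - (1.0 + y.dot(y) / sy) * (sg / sy) * s)
```

Since the implicit H satisfies the secant equation H @ y == s, feeding g = y returns exactly -s, a quick sanity check on any memoryless implementation.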