Abstract: The degree of numerical linear independence is proposed and discussed. Based on this linear independence theory, a modified limited memory BFGS method is developed. Like the standard limited memory method, this new method determines the new update by applying the updating formula m times to an initial positive diagonal matrix, using the m previous pairs of iterate and gradient changes. Apart from the most recent pair, which guarantees quadratic termination, the choice of the other (m-1) pairs in the new method depends on the degree of numerical linear independence of previous search directions. In addition, the numerical linear independence theory is further discussed and the computation of the degree of linear independence is simplified. Theoretical and numerical results show that this modified method efficiently improves on the standard limited memory method.
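The update described above is the standard L-BFGS two-loop recursion; a minimal numpy sketch follows, assuming the stored pairs (s_i, y_i) of iterate and gradient changes are ordered oldest to newest and that the (m-1) older pairs have already been chosen by the paper's linear-independence criterion (the selection itself is not shown).

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list, gamma):
    """Two-loop recursion: implicitly apply the BFGS updating formula
    m times to the initial positive diagonal matrix H0 = gamma * I,
    using the m stored pairs (s_i, y_i); returns the search direction."""
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)   # newest pair first
        alphas.append(a)
        q -= a * y
    r = gamma * q                # multiply by the initial diagonal matrix
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)   # oldest pair first
        r += (a - b) * s
    return -r
```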
Abstract: In this paper we propose an algorithm based on the BFGS quasi-Newton method to solve a linear program. The choice of this method is justified by its theoretical efficiency, the ease of determining a descent direction, and its fast convergence towards an optimal solution. Our proposed method is compared with Newton's method for linear programs, named lpnew, which is widely used as an optimization algorithm for classification problems.
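For reference, the descent direction in such a method is $d_k = -H_k g_k$, where $H_k$ is maintained by the standard inverse BFGS update (the paper's LP-specific adaptation is not reproduced here):

$$
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right) H_k \left(I - \rho_k y_k s_k^{\top}\right) + \rho_k s_k s_k^{\top},
\qquad \rho_k = \frac{1}{y_k^{\top} s_k},
$$

with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$.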
Funding: Supported by NSF of China grant 10471036 and a 973 project.
Abstract: In this paper, we propose an algorithm for solving nonlinear monotone equations by combining the limited memory BFGS method (L-BFGS) with a projection method. We show that the method is globally convergent if the equation involves a Lipschitz continuous monotone function. We also present some preliminary numerical results.
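A common projection step for monotone equations, due to Solodov and Svaiter, is sketched below; the abstract does not spell out the paper's exact step, so this is only a representative form. After a line search along the L-BFGS direction finds a trial point z_k = x_k + alpha_k * d_k, the current iterate is projected onto the hyperplane through z_k with normal F(z_k):

```python
import numpy as np

def projection_step(x, z, Fz):
    """Project x onto {u : Fz.(u - z) = 0}; when F is monotone this
    hyperplane separates x from the zeros of F, so the step never
    moves away from the solution set."""
    return x - (np.dot(Fz, x - z) / np.dot(Fz, Fz)) * Fz
```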
Funding: Supported by the National Natural Science Foundation of China under Grant No. 10871226, the Natural Science Foundation of Shandong Province under Grant No. ZR2009AL006, the Development Project Foundation for Science Research of Shandong Education Department under Grant No. J09LA05, and the Science Project Foundation of Liaocheng University under Grant No. X0810027.
Abstract: In this paper, a new modified BFGS method without line searches is proposed. Unlike the traditional BFGS method, this modified BFGS method is based on the so-called fixed steplength strategy introduced by Sun and Zhang. Under some suitable assumptions, the global convergence and the superlinear convergence of the new algorithm are established, respectively. Some preliminary numerical experiments, which show that the new algorithm is feasible, are also reported.
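The abstract does not reproduce the Sun-Zhang rule, but a representative fixed steplength of this kind, assuming the gradient is Lipschitz continuous with constant $L$, is

$$
\alpha_k = \delta \,\frac{-g_k^{\top} d_k}{L \lVert d_k \rVert^2}, \qquad \delta \in (0, 2),
$$

which, by the descent lemma, guarantees $f(x_k + \alpha_k d_k) \le f(x_k) - \frac{\delta(2-\delta)}{2L}\,\frac{(g_k^{\top} d_k)^2}{\lVert d_k \rVert^2}$ with no line search at all; the paper's exact formula may differ.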
Funding: Supported by the National Natural Science Foundation of China (Grants 11001075 and 11161003), the Post-doctoral Foundation of China grant 20090461094, and the Natural Science Foundation of Henan Province Education Department grant 2010B110004.
Abstract: In this paper, a modified limited memory BFGS method for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses the global convergence property without any convexity assumption on the objective function. Under some suitable conditions, the global convergence of the proposed method is proved. Some numerical results are reported which illustrate that the proposed method is efficient.
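Global convergence without convexity is typically obtained by updating only when the curvature information is reliable. One widely used device, which the abstract does not confirm as this paper's exact rule but which illustrates the idea, is the cautious update of Li and Fukushima: accept the pair $(s_k, y_k)$ only if

$$
\frac{y_k^{\top} s_k}{\lVert s_k \rVert^2} \ge \epsilon \lVert g_k \rVert^{\alpha}
$$

for fixed constants $\epsilon > 0$ and $\alpha > 0$, and otherwise skip the update (or drop the pair from the limited memory).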
Funding: Financially supported by the National Important and Special Project on Science and Technology (2011ZX05005-005-007HZ) and the National Natural Science Foundation of China (No. 41274116).
Abstract: In full waveform inversion (FWI), Hessian information of the misfit function is of vital importance for accelerating the convergence of the inversion; however, it is usually not feasible to directly calculate the Hessian matrix and its inverse. Although the limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) and Hessian-free inexact Newton (HFN) methods are able to use approximate Hessian information, the information they collect is limited. The two methods can be interlaced because they are able to provide Hessian information for each other; however, the performance of the hybrid iterative method depends on an effective switch between the two methods. We have designed a new scheme to realize a dynamic switch between the two methods based on the decrease ratio (DR) of the misfit function (objective function), and we propose a modified hybrid iterative optimization method. In the new scheme, we compare the DR of the two methods for a given computational cost and choose the method with the faster DR. Using these steps, the modified method always runs the more efficient of the two. Tests on the Marmousi and overthrust models indicate that convergence with our modified method is significantly faster than with the L-BFGS method, with no loss of inversion quality. Moreover, our modified method slightly outperforms the enriched method in convergence speed and exhibits better efficiency than the HFN method.
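A minimal sketch of such a DR-based switch is given below; the paper's exact DR definition is not reproduced, so misfit decrease per unit computational cost is assumed, and all names are illustrative.

```python
def choose_method(drop_lbfgs, cost_lbfgs, drop_hfn, cost_hfn):
    """Compare the decrease ratio (misfit drop per unit cost, an
    assumed definition) achieved by L-BFGS and HFN over trial
    segments, and continue the inversion with the faster method."""
    dr_lbfgs = drop_lbfgs / cost_lbfgs
    dr_hfn = drop_hfn / cost_hfn
    return "L-BFGS" if dr_lbfgs >= dr_hfn else "HFN"
```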
Funding: Supported by the Major National Science & Technology Projects (2010ZX03006-002-04), the National Natural Science Foundation of China (61072070), the Doctoral Programs Foundation of the Ministry of Education (20110203110011), the "111 Project" (B08038), the Fundamental Research Funds of the Ministry of Education (72124338), the Key Programs for Natural Science Foundation of Shanxi Province (2012JZ8002), and the Foundation of the State Key Laboratory of Integrated Services Networks (ISN1101002).
Abstract: With the emergence of location-based applications in various fields, higher positioning accuracy is demanded. By utilizing time differences of arrival (TDOAs) and gain ratios of arrival (GROAs), an efficient algorithm for estimating the source position is proposed, which exploits the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method to solve the nonlinear equations at the source location under additive measurement error. Although the accuracy of the two-step weighted least squares (WLS) method based on TDOAs and GROAs is very high, that method has high computational complexity. The proposed approach achieves the same accuracy and bias with lower computational complexity when the signal-to-noise ratio (SNR) is high, and achieves better accuracy and smaller bias at low SNR. The proposed algorithm can be applied in real environments thanks to its real-time property and good robustness. Simulation results show that, with a good initial guess, the proposed estimator converges to the true solution and achieves the Cramer-Rao lower bound (CRLB) accuracy for both near-field and far-field sources.
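A hedged sketch of this kind of estimator follows, posed as a least-squares fit of the source position to the TDOA and GROA measurements and minimized with scipy's BFGS implementation; the inverse-distance amplitude model for the gain ratios and all variable names are assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

def locate(sensors, tdoa, groa, c, u0):
    """sensors: (M, d) positions; tdoa, groa: measurements of sensors
    1..M-1 relative to sensor 0; c: propagation speed; u0: initial guess."""
    def cost(u):
        r = np.linalg.norm(sensors - u, axis=1)   # source-to-sensor ranges
        res_t = (r[1:] - r[0]) / c - tdoa         # TDOA residuals
        res_g = r[0] / r[1:] - groa               # GROA residuals (amplitude ~ 1/range)
        return np.sum(res_t**2) + np.sum(res_g**2)
    return minimize(cost, u0, method="BFGS").x
```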
Abstract: The line search strategy is crucial for an efficient unconstrained optimization algorithm. One reason the Wolfe line search is recommended is that it ensures positive definiteness of BFGS updates. When gradient information is costly to obtain, the Armijo-Goldstein line search may be preferred. To maintain positive definiteness of BFGS updates under Armijo-Goldstein line searches, a slightly modified form of the BFGS update was proposed by I.D. Coope and C.J. Price (Journal of Computational Mathematics, 13 (1995), 156-160), but its convergence properties have remained open until now. This paper shows that the modified BFGS algorithm is globally and superlinearly convergent under Armijo-Goldstein line searches.
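The positive-definiteness claim rests on the curvature condition of the Wolfe line search: with $c_2 \in (0,1)$ and a descent direction $d_k$ (so $g_k^{\top} d_k < 0$), the condition $g(x_k + \alpha_k d_k)^{\top} d_k \ge c_2\, g_k^{\top} d_k$ yields

$$
y_k^{\top} s_k = \alpha_k \left(g_{k+1} - g_k\right)^{\top} d_k \ge \alpha_k (c_2 - 1)\, g_k^{\top} d_k > 0,
$$

and $y_k^{\top} s_k > 0$ is exactly what keeps the BFGS update positive definite. The Armijo-Goldstein conditions give no such lower bound on $y_k^{\top} s_k$, which is why the Coope-Price modification is needed.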
Funding: Supported by NSFC grants 10001031 and 70472074, and by NSERC Grant 283103.
Abstract: The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, so as to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the corresponding nonmonotone self-scaling BFGS algorithms. We prove that, under conditions weaker than those in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
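A minimal backtracking sketch of one common nonmonotone form (the Grippo-Lampariello-Lucidi rule) is given below; the abstract does not say which two forms the paper studies, so this is illustrative only.

```python
def nonmonotone_armijo(f, x, d, g, f_history, delta=1e-4, beta=0.5, M=10):
    """Accept a step that decreases f relative to the maximum of the
    last M function values, rather than relative to f(x) alone; this
    lets individual iterations increase f while forcing long-run descent."""
    f_ref = max(f_history[-M:])   # reference value over recent iterates
    gd = float(g @ d)             # directional derivative, must be negative
    alpha = 1.0
    while f(x + alpha * d) > f_ref + delta * alpha * gd:
        alpha *= beta
    return alpha
```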
Abstract: This paper presents a stochastic modification of a limited memory BFGS method to solve bound-constrained global minimization problems with a differentiable cost function having no further smoothness. The approach is a stochastic descent method in which the deterministic sequence generated by a limited memory BFGS method is replaced by a sequence of random variables. To enhance the performance of the proposed algorithm and make sure the perturbations lie within the feasible domain, we develop a novel perturbation technique based on truncating a multivariate double exponential distribution to handle the bound constraints; a theoretical study and a simulation of the developed truncated distribution are also presented. Theoretical results ensure that the proposed method converges almost surely to the global minimum. The performance of the algorithm is demonstrated through numerical experiments on some typical test functions as well as on some further engineering problems. Numerical comparisons with stochastic and meta-heuristic methods indicate that the suggested algorithm is promising.
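A per-coordinate sketch of the truncation idea is given below, using inverse-CDF sampling of a double exponential (Laplace) distribution restricted to the bounds so that every perturbation stays feasible; the paper's multivariate construction and parameters are not reproduced, so this is illustrative only.

```python
import numpy as np

def laplace_cdf(x, mu, b):
    """CDF of the double exponential (Laplace) distribution."""
    z = (x - mu) / b
    return np.where(z < 0, 0.5 * np.exp(z), 1.0 - 0.5 * np.exp(-z))

def truncated_laplace(mu, b, lo, hi, rng):
    """Sample coordinatewise from Laplace(mu, b) truncated to [lo, hi]
    by drawing uniformly between the CDF values at the bounds and
    inverting the Laplace CDF in closed form."""
    p = rng.uniform(laplace_cdf(lo, mu, b), laplace_cdf(hi, mu, b))
    q = p - 0.5
    return mu - b * np.sign(q) * np.log(1.0 - 2.0 * np.abs(q))

# Hypothetical usage: perturb the current iterate x within bounds lb, ub
# x_new = truncated_laplace(x, b=0.1, lo=lb, hi=ub, rng=np.random.default_rng(0))
```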