Abstract: Conjugate gradient optimization algorithms are distinguished by their search directions, which differ in the choice of the parameter appearing in those directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method is proved under two kinds of common line searches. First, using a reverse modulus of continuity function and a forcing function, it is shown that the new method for unconstrained optimization works for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, using a comparison technique, some general convergence properties of the new method under another kind of step size rule are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
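For reference, the classical nonlinear conjugate gradient iteration and the FR, PR and HS parameter choices mentioned above take the following standard form; the abstract does not give the paper's exact restarting parameter, so only the classical formulas are reproduced here.

```latex
% Standard nonlinear CG iteration (g_k = \nabla f(x_k), \alpha_k from a line search)
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k

% Classical parameter choices
\beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{\mathrm{PR}} = \frac{g_{k+1}^{\top}(g_{k+1}-g_k)}{\|g_k\|^2}, \qquad
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top}(g_{k+1}-g_k)}{d_k^{\top}(g_{k+1}-g_k)}
```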
Abstract: Motion estimation is an important part of the MPEG-4 encoder because of its significant impact on the bit rate and the output quality of the encoded sequence. Unfortunately, this stage takes a significant portion of the encoding time, especially when the straightforward full search (FS) algorithm is used. In this paper, a new algorithm named diamond block-based gradient descent search (DBBGDS), which is significantly faster than FS and gives similar output quality, is proposed. In addition, several other algorithms, namely three step search (TSS), improved three step search (ITSS), new three step search (NTSS), four step search (4SS), cellular search (CS), diamond search (DS) and block-based gradient descent search (BBGDS), are implemented and compared with DBBGDS. The experimental results show that DBBGDS has its own advantages: although DS has been adopted by the MPEG-4 VM, its output quality is worse than that of the proposed algorithm at a similar complexity, and compared with BBGDS the proposed algorithm achieves better output quality.
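As background, the sketch below illustrates generic diamond-pattern block matching with a sum-of-absolute-differences (SAD) cost, in the spirit of DS/BBGDS. The function names, search patterns, block size and stopping rule are illustrative assumptions and do not reproduce the authors' DBBGDS algorithm.

```python
import numpy as np

def sad(cur_block, ref_frame, x, y, bsize):
    """SAD between the current block and the candidate block at (x, y)."""
    h, w = ref_frame.shape
    if x < 0 or y < 0 or x + bsize > w or y + bsize > h:
        return np.inf  # candidate falls outside the reference frame
    cand = ref_frame[y:y + bsize, x:x + bsize]
    return np.abs(cur_block.astype(np.int64) - cand.astype(np.int64)).sum()

def diamond_search(cur_block, ref_frame, x0, y0, bsize=16, max_iter=32):
    """Generic diamond-pattern block-matching search (illustrative only)."""
    # Large diamond pattern for coarse moves, small pattern for refinement.
    large = [(0, 0), (2, 0), (-2, 0), (0, 2), (0, -2),
             (1, 1), (1, -1), (-1, 1), (-1, -1)]
    small = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
    cx, cy = x0, y0
    for _ in range(max_iter):
        costs = [(sad(cur_block, ref_frame, cx + dx, cy + dy, bsize), dx, dy)
                 for dx, dy in large]
        _, dx, dy = min(costs)
        if (dx, dy) == (0, 0):        # best point is the centre: refine and stop
            break
        cx, cy = cx + dx, cy + dy     # move the diamond centre and continue
    costs = [(sad(cur_block, ref_frame, cx + dx, cy + dy, bsize), dx, dy)
             for dx, dy in small]
    _, dx, dy = min(costs)
    return cx + dx - x0, cy + dy - y0  # motion vector relative to the block origin
```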
Abstract: The spectral conjugate gradient method is an algorithm obtained by combining the spectral gradient method and the conjugate gradient method; it inherits the global convergence and simplicity of the spectral gradient method and the small storage requirement of the conjugate gradient method. Moreover, it has been proved that the search direction of the spectral conjugate gradient method is a descent direction of the objective function at every iteration, without relying on any line search. The spectral conjugate gradient method is applied to full waveform inversion in numerical tests on the Marmousi model. The authors compare numerical results obtained by the steepest descent method, the conjugate gradient method and the spectral conjugate gradient method, which show that the spectral conjugate gradient method is superior to the other two.
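The abstract does not state which spectral conjugate gradient variant is used; one widely cited form (due to Birgin and Martínez) combines a spectral, Barzilai-Borwein-type scaling of the gradient with a conjugate gradient correction, shown here purely for context.

```latex
% One common spectral conjugate gradient direction (illustrative form)
s_k = x_{k+1} - x_k, \qquad y_k = g_{k+1} - g_k, \qquad
\theta_{k+1} = \frac{s_k^{\top} s_k}{s_k^{\top} y_k}

d_0 = -g_0, \qquad
d_{k+1} = -\theta_{k+1}\, g_{k+1} + \beta_k\, d_k
```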
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 10831006 and 10971017).
Abstract: Conjugate gradient methods have played a special role in solving large-scale nonlinear problems. Recently, the author and Dai proposed an efficient nonlinear conjugate gradient method called CGOPT, obtained by seeking the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method. In this paper, we make use of two types of modified secant equations to improve the CGOPT method. Under some assumptions, the improved methods are shown to be globally convergent. Numerical results are also reported.
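The abstract does not specify which modified secant equations are adopted. For context, quasi-Newton-type updates are built on the standard secant equation, and modified versions typically replace the gradient difference with a vector that also carries function-value information; one commonly used modification of this kind is shown below as an illustration only.

```latex
% Standard secant equation (s_k = x_{k+1}-x_k,\; y_k = g_{k+1}-g_k)
B_{k+1} s_k = y_k

% A commonly used modification incorporating function values
B_{k+1} s_k = \tilde y_k, \qquad
\tilde y_k = y_k + \frac{\vartheta_k}{s_k^{\top} u_k}\, u_k, \qquad
\vartheta_k = 6\bigl(f(x_k) - f(x_{k+1})\bigr) + 3\,(g_k + g_{k+1})^{\top} s_k
```

Here u_k is any vector with s_k^T u_k ≠ 0 (often u_k = s_k is taken).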
Funding: Supported by the National Science Foundation of China under Grant No. 70971076 and the Foundation of Shandong Provincial Education Department under Grant No. J10LA59.
Abstract: In this article, a new descent memory gradient method without restarts is proposed for solving large-scale unconstrained optimization problems. The method has the following attractive properties: 1) the search direction is a sufficient descent direction at every iteration, independent of the line search used; 2) the search direction always satisfies the angle property, independent of the convexity of the objective function. Under mild conditions, the authors prove that the proposed method is globally convergent, and its convergence rate is also investigated. Numerical results show that the new descent memory gradient method is efficient on the given test problems.
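As an illustration of the first property above, the sketch below builds a memory gradient direction d_k = -g_k + beta_k d_{k-1} and enforces a sufficient descent condition g_k^T d_k <= -c ||g_k||^2 by falling back to -g_k. The choice of beta_k and the constants are assumptions for illustration, not the rule analyzed in the paper.

```python
import numpy as np

def memory_gradient_direction(g, d_prev, c=1e-3):
    """Generic memory gradient direction with a sufficient-descent safeguard
    (illustrative sketch; not the paper's exact parameter rule)."""
    if d_prev is None:
        return -g  # first iteration: steepest descent
    # One simple (assumed) choice of the memory parameter, bounded by the gradient norm.
    beta = 0.5 * np.linalg.norm(g) / max(np.linalg.norm(d_prev), 1e-12)
    d = -g + beta * d_prev
    if g @ d > -c * np.dot(g, g):  # sufficient descent check g^T d <= -c ||g||^2
        d = -g                      # fall back to steepest descent, which satisfies it
    return d
```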