Funding: Supported by Singapore AcRF RG59/08 (M52110092) and Singapore NRF 2007 IDM-IDM002-010.
Abstract: In this paper, we present a theoretical analysis of superconvergent gradient recovery for linear finite elements on Par6 meshes, whose dual is the centroidal Voronoi tessellation with the lowest energy per unit volume and the congruent cell predicted by the three-dimensional Gersho conjecture. We show that the linear finite element solution uh and the linear interpolant uI have superclose gradients on Par6 meshes. Consequently, the gradient recovered from the finite element solution by the superconvergence patch recovery method is superconvergent to ∇u. A numerical example verifies the theoretical result.
Abstract: Full waveform inversion (FWI) is a challenging data-fitting procedure between modeled and observed wavefield values. At its core, FWI is an optimization problem, so the choice of optimization method is important. This study builds on the conventional memoryless quasi-Newton (MLQN) method. Because conjugate gradient methods enjoy superlinear convergence, the authors use Fletcher-Reeves (FR) conjugate gradient information to improve the search direction of the conventional MLQN method. The improved MLQN method incorporates not only gradient and model information but also conjugate gradient information, without increasing the cost of each iteration. Numerical experiments show that, compared with the conventional MLQN method, the improved method maintains computational efficiency while improving inversion precision.
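The FR direction update referred to above can be sketched on a simple quadratic. This is a generic illustration with an exact line search, not the authors' improved MLQN scheme; the function name and test problem are invented for illustration.

```python
import numpy as np

def fr_conjugate_gradient(A, b, x0, iters=None):
    """Minimise f(x) = 0.5*x^T A x - b^T x (A symmetric positive
    definite) by conjugate gradient with the Fletcher-Reeves (FR)
    beta and an exact line search.  A generic sketch, not the
    paper's improved MLQN scheme."""
    x = x0.astype(float)
    g = A @ x - b                           # gradient of f
    d = -g                                  # first direction: steepest descent
    for _ in range(iters or len(b)):
        alpha = (g @ g) / (d @ (A @ d))     # exact step for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves formula
        d = -g_new + beta * d               # conjugate search direction
        g = g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = fr_conjugate_gradient(A, b, np.array([0.0, 0.0]))  # -> [0.2, 0.4]
```

For a quadratic with exact line search, this recovers the classical linear CG iteration and terminates in at most n steps.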
Abstract: The spectral conjugate gradient method combines the spectral gradient method with the conjugate gradient method; it inherits the global convergence and simplicity of the former and the small storage requirement of the latter. Moreover, the spectral conjugate gradient method has been proved to generate a descent direction of the objective function at every iteration, without relying on any line search. The method is applied to full waveform inversion in numerical tests on the Marmousi model. The authors compare results obtained with the steepest descent, conjugate gradient, and spectral conjugate gradient methods; the comparison shows that the spectral conjugate gradient method is superior to the other two.
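One simple realisation of a spectral conjugate gradient direction combines a Barzilai-Borwein spectral parameter with an FR-type beta; the paper's exact formulas may differ, and the quadratic test problem below stands in for FWI purely for illustration.

```python
import numpy as np

def spectral_cg(A, b, x0, iters=50):
    """Spectral conjugate gradient sketch: the direction is
    d = -theta*g + beta*d_prev, where theta is a Barzilai-Borwein
    spectral parameter.  One simple variant (FR-type beta), applied
    to the quadratic f(x) = 0.5*x^T A x - b^T x; not the paper's
    exact formulas."""
    x = x0.astype(float)
    g = A @ x - b
    d = -g
    for _ in range(iters):
        alpha = -(g @ d) / (d @ (A @ d))   # exact line search
        s = alpha * d                       # step s_k
        x = x + s
        g_new = A @ x - b
        if g_new @ g_new < 1e-24:
            break
        y = g_new - g                       # gradient change y_k
        theta = (s @ s) / (s @ y)           # spectral (BB) parameter
        beta = (g_new @ g_new) / (g @ g)    # FR-type beta
        d = -theta * g_new + beta * d       # spectral CG direction
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_hat = spectral_cg(A, b, np.array([5.0, -5.0]))
```

With an exact line search the previous direction is orthogonal to the new gradient, so the new direction satisfies g·d = -theta*||g||² < 0: a descent direction, as the abstract notes.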
Funding: Supported by the National High Technology Research and Development Programme of China (No. 2011AA7014061).
Abstract: In this paper, a novel modified Levenberg-Marquardt algorithm is proposed for solving the source localization problem using angles of arrival (AOAs) and imprecise sensor positions. Conventional source localization algorithms, such as the Gauss-Newton and conjugate gradient algorithms, suffer from local minima and require a good initial guess. This paper presents a new optimization technique that finds descent directions to avoid divergence, and a trust region method is introduced to accelerate convergence. Compared with conventional methods, the new algorithm offers greater stability and robustness, tolerating stronger nonlinearity and a wider convergence region. Simulation results demonstrate that the proposed algorithm improves on typical methods in both speed and robustness and is able to avoid local minima.
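A plain Levenberg-Marquardt iteration with a simple damping update gives the flavour of the approach; this is a generic sketch, not the paper's modified trust-region variant, and the three-sensor bearing data below are hypothetical.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, iters=100):
    """Plain Levenberg-Marquardt for nonlinear least squares
    0.5*||r(x)||^2: each step solves (J^T J + lam*I) dx = -J^T r,
    lowering lam after accepted steps and raising it after rejected
    ones.  A generic sketch, not the paper's modified variant."""
    x = x0.astype(float)
    cost = lambda z: 0.5 * residual(z) @ residual(z)
    for _ in range(iters):
        r, J = residual(x), jacobian(x)
        dx = np.linalg.solve(J.T @ J + lam * np.eye(len(x)), -J.T @ r)
        if cost(x + dx) < cost(x):
            x, lam = x + dx, lam * 0.5   # accept: trust the model more
        else:
            lam *= 10.0                  # reject: damp harder
    return x

# Toy AOA localisation (hypothetical data): recover a source from
# noise-free bearings measured at three known sensors.
sensors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
true_src = np.array([1.0, 2.0])
diff = true_src - sensors
bearings = np.arctan2(diff[:, 1], diff[:, 0])
res = lambda p: np.arctan2(p[1] - sensors[:, 1], p[0] - sensors[:, 0]) - bearings
def jac(p):
    dx, dy = p[0] - sensors[:, 0], p[1] - sensors[:, 1]
    rho2 = dx**2 + dy**2
    return np.column_stack((-dy / rho2, dx / rho2))   # d(bearing)/d(position)
est = levenberg_marquardt(res, jac, np.array([3.0, 1.0]))
```

Because the residuals are zero at the true source, the accepted steps approach Gauss-Newton behaviour near the solution.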
Abstract: Conjugate gradient optimization algorithms depend on search directions defined by different parameter choices. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method is proved under two common kinds of line search. First, using a reverse modulus of continuity function and a forcing function, it is shown that the new method works for unconstrained optimization of a continuously differentiable function under Curry-Altman's step size rule with a bounded level set. Second, by a comparison technique, general convergence properties of the new method under other step size rules are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
Funding: Supported by the National Natural Science Foundation of China (Nos. 60472058, 60975017) and the Fundamental Research Funds for the Central Universities (Nos. 2009B32614, 2009B32414).
Abstract: Based on the approximate sparseness of speech in a wavelet basis, compressed sensing theory is applied to compress and reconstruct speech signals. Compared with the one-dimensional orthogonal wavelet transform (OWT), a two-dimensional OWT combining Dmeyer and biorthogonal wavelets is first proposed to raise the running efficiency of speech frame processing; furthermore, a threshold is set to improve sparseness. An adaptive subgradient projection method (ASPM) is then adopted for speech reconstruction in compressed sensing, together with a mechanism that adaptively adjusts the inflation parameter across iterations for fast convergence. Theoretical analysis and simulation results show that the algorithm converges quickly, achieves low reconstruction error, and exhibits high robustness at different noise intensities.
Funding: Supported by the 2023 General Scientific Research Fund for Universities Directly under Inner Mongolia, China, at Inner Mongolia University of Finance and Economics (NCYWT23026), and by the 2024 High-quality Research Achievements Cultivation Fund Project of Inner Mongolia University of Finance and Economics, China (GZCG2479).
Abstract: This paper puts forward a two-parameter family of nonlinear conjugate gradient (CG) methods without line search for unconstrained optimization. The main feature of the method is that it does not rely on any line search and only requires a simple step size formula, while always generating a sufficient descent direction. Under certain assumptions, the proposed method is proved to be globally convergent. Finally, the method is compared with other candidate methods; extensive numerical experiments show that it is competitive and effective.
Funding: Supported by the Natural Science Foundation of China.
Abstract: The convergence of online gradient methods for two-layer feedforward neural networks is discussed in general settings. The theory is applied to some common activation functions and energy functions.
Funding: Supported by the National Science Foundation of China (10571106) and the Foundation of Qufu Normal University.
Abstract: In this paper, a new region of βk with respect to βk^PRP is given. Under two Armijo-type line searches, the authors establish global convergence of the dependent PRP conjugate gradient methods, extending the global convergence results for the PRP conjugate gradient method proved by Grippo and Lucidi (1997) and Dai and Yuan (2002).
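A generic PRP conjugate gradient iteration with a backtracking Armijo line search shows the ingredients involved; this sketch uses a plain restart safeguard and the standard PRP beta, not the paper's restricted βk region.

```python
import numpy as np

def prp_cg(f, grad, x0, iters=500):
    """Conjugate gradient with the Polak-Ribiere-Polyak (PRP) beta
    and a backtracking Armijo line search, restarting with steepest
    descent whenever the direction fails to be a descent direction.
    A generic sketch, not the paper's restricted beta region."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if g @ d >= 0.0:
            d = -g                          # restart: steepest descent
        t = 1.0                             # Armijo backtracking
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = g_new @ (g_new - g) / (g @ g)   # PRP formula
        d, g = -g_new + beta * d, g_new
    return x

# Rosenbrock, a standard unconstrained test problem (minimum at (1, 1)).
f = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200.0 * (x[1] - x[0]**2)])
x_opt = prp_cg(f, grad, np.array([-1.2, 1.0]))
```

Note that the PRP beta effectively restarts itself when consecutive gradients are similar, which is one reason for its good practical behaviour.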
Funding: Supported by the National Natural Science Foundation of China (19801033, 10171104).
Abstract: Conjugate gradient methods are very important for unconstrained optimization, especially for large-scale problems. In this paper, we propose a new conjugate gradient method that uses a nonmonotone line search technique. Under mild assumptions, we prove global convergence of the method. Some numerical results are also presented.
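The nonmonotone line search idea can be sketched as follows: sufficient decrease is measured against the worst of the last few function values rather than the current one, so individual steps may increase f. This is a Grippo-Lampariello-Lucidi-style sketch with an FR beta, not the paper's exact method.

```python
import numpy as np
from collections import deque

def cg_nonmonotone(f, grad, x0, iters=200, memory=5):
    """Conjugate gradient (FR beta) with a nonmonotone Armijo test:
    decrease is compared against the worst of the last `memory`
    function values.  A sketch of the technique, not the paper's
    exact method."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    recent = deque([f(x)], maxlen=memory)    # recent function values
    for _ in range(iters):
        if g @ d >= 0.0:
            d = -g                           # keep a descent direction
        f_ref, t = max(recent), 1.0          # nonmonotone reference value
        while f(x + t * d) > f_ref + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
        recent.append(f(x))
        g_new = grad(x)
        if g_new @ g_new < 1e-20:
            break
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves beta
        d, g = -g_new + beta * d, g_new
    return x

# Ill-conditioned quadratic: f(x) = 0.5*(x0^2 + 10*x1^2).
f = lambda x: 0.5 * (x[0]**2 + 10.0 * x[1]**2)
grad = lambda x: np.array([x[0], 10.0 * x[1]])
x_opt = cg_nonmonotone(f, grad, np.array([3.0, 2.0]))
```

Relaxing monotonicity lets the method accept the natural step of the conjugate direction more often, which can avoid the short, zigzagging steps a strict Armijo test forces on ill-conditioned problems.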
Funding: Supported by the National Natural Science Foundation of China (Grant No. 11171299).
Abstract: The matrix rank minimization problem arises in many engineering applications. As this problem is NP-hard, a nonconvex relaxation, Schatten-p quasi-norm minimization (0 < p < 1), has been developed to approximate the rank function closely. We study the performance of the projected gradient descent algorithm for solving the Schatten-p quasi-norm minimization (0 < p < 1) problem. Based on the matrix restricted isometry property (M-RIP), we give a convergence guarantee and error bound for this algorithm and show that it is robust to noise, with an exponential convergence rate.
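A projected gradient iteration for low-rank recovery can be sketched with a truncated-SVD projection; this fixed-rank projection is a simple stand-in for the Schatten-p step the paper analyses, and the Gaussian measurement setup is invented for illustration.

```python
import numpy as np

def pgd_lowrank(Phi, y, shape, rank, iters=500):
    """Projected gradient descent for low-rank recovery from linear
    measurements y = Phi @ vec(M): a gradient step on the data-fit
    term 0.5*||Phi vec(X) - y||^2, then projection onto rank-`rank`
    matrices by truncated SVD.  The fixed-rank projection is a
    stand-in for the Schatten-p step analysed in the paper."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2    # 1/L for the smooth term
    X = np.zeros(shape)
    for _ in range(iters):
        G = (Phi.T @ (Phi @ X.ravel() - y)).reshape(shape)   # gradient
        U, s, Vt = np.linalg.svd(X - step * G, full_matrices=False)
        s[rank:] = 0.0                  # keep the top `rank` singular values
        X = (U * s) @ Vt
    return X

# Toy recovery of a rank-1 matrix from random Gaussian measurements.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(5), rng.standard_normal(4))  # rank-1 target
Phi = rng.standard_normal((80, 20))                           # measurement map
y = Phi @ M.ravel()
M_hat = pgd_lowrank(Phi, y, M.shape, rank=1)
```

When the measurement map satisfies an M-RIP-type condition, iterations of this form contract linearly toward the target matrix, which is the exponential convergence behaviour the abstract describes.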