Funding: Supported by the National Natural Science Foundation of China (20710015), the Natural Science Foundation of Jiangsu (YW06037), and the Natural Science Foundation of Changzhou Institute of Technology (YN09090).
Abstract: Many nonlinear programming (NLP) models arising in practical engineering applications are nonsmooth, and traditional optimization techniques such as quasi-Newton or steepest-descent methods cannot solve them well. Direct search methods and generating set search (GSS) methods, which do not rely on computing or approximating directional derivatives, have received considerable attention and study in recent years. This paper annotates several of the requirements stipulated for GSS, such as how the sufficient-decrease and simple-decrease conditions affect the step-length acceptance criterion, how to avoid poor descent directions and iteration step lengths, and why an upper bound must be imposed on the step-length contraction factor, in order to provide a reference for further research in this area.
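To make the step-acceptance criteria concrete, here is a minimal sketch of one GSS iteration in the compass-search style. It is an illustration under stated assumptions, not the paper's method: the quadratic forcing function rho(delta) = c*delta^2, the constants c and theta, and the coordinate generating set are all assumed for the example.

```python
import numpy as np

def gss_step(f, x, delta, directions, c=1e-4, theta=0.5, theta_max=0.9):
    """One generating-set-search iteration with a sufficient-decrease test.

    Illustrative sketch: rho(delta) = c * delta**2 is an assumed forcing
    function, and theta is kept below theta_max < 1 because, as the paper
    discusses, the step-length contraction factor needs an upper bound.
    """
    assert theta <= theta_max < 1.0
    fx = f(x)
    rho = c * delta ** 2
    for d in directions:
        trial = x + delta * d
        # Sufficient decrease: f(trial) < f(x) - rho(delta).
        # A *simple*-decrease rule would only require f(trial) < f(x).
        if f(trial) < fx - rho:
            return trial, delta          # successful step: keep the step length
    return x, theta * delta              # unsuccessful: contract the step length

# Usage: minimize the nonsmooth function f(x, y) = x**2 + |y| from (1, 1).
f = lambda z: z[0] ** 2 + abs(z[1])
D = [np.array(v, dtype=float) for v in ([1, 0], [-1, 0], [0, 1], [0, -1])]
x, delta = np.array([1.0, 1.0]), 1.0
for _ in range(200):
    x, delta = gss_step(f, x, delta, D)
print(x, delta)   # x approaches the minimizer (0, 0) while delta shrinks
```

Swapping the sufficient-decrease test for the simple-decrease test changes which trial steps are accepted; the interplay between that choice, the contraction factor, and the quality of the search directions is exactly what the abstract's annotations concern.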
Abstract: Conjugate gradient optimization algorithms depend on their search directions, which vary with the choice of parameters in those directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method is proved for two kinds of common line searches. First, it is shown that, using the reverse modulus of continuity function and a forcing function, the new method for unconstrained optimization works for a continuously differentiable function with the Curry-Altman step-size rule and a bounded level set. Second, by using a comparison technique, some general convergence properties of the new method with other kinds of step-size rules are established. Numerical experiments show that the new method is efficient compared with the FR conjugate gradient method.
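Since the abstract does not spell out the new method's parameter, the sketch below assumes a Hu-Storey-type hybrid, beta = max(0, min(beta_PR, beta_FR)), whose truncation to zero acts as a steepest-descent restart, and it substitutes a plain Armijo backtracking line search for the Curry-Altman and other step-size rules analyzed in the paper. All names and constants are illustrative.

```python
import numpy as np

def restarting_cg(f, grad, x0, max_iter=2000, tol=1e-6):
    """Restarting conjugate-gradient sketch with an assumed Hu-Storey-type
    hybrid parameter beta = max(0, min(beta_PR, beta_FR)).

    When beta is truncated to zero, or the hybrid direction fails to be a
    descent direction, the search resets to steepest descent -- the
    'restart' mechanism.  The Armijo backtracking rule below stands in for
    the step-size rules actually analyzed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d.
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_pr = g_new.dot(g_new - g) / g.dot(g)   # Polak-Ribiere
        beta_fr = g_new.dot(g_new) / g.dot(g)       # Fletcher-Reeves
        beta = max(0.0, min(beta_pr, beta_fr))      # hybrid with restart
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:                       # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the 2-D Rosenbrock function from the standard start point.
f = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
grad = lambda z: np.array([
    -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0] ** 2),
    200 * (z[1] - z[0] ** 2),
])
print(restarting_cg(f, grad, [-1.2, 1.0]))   # converges toward (1, 1)
```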