Abstract
An assumed condition is imposed on the parameters in the search direction of the memory gradient algorithm, which determines a range of values within which any choice of the parameters yields a sufficient descent direction of the objective function; on this basis a new class of memory gradient algorithms is proposed. Global convergence of the algorithm is discussed under the generalized Armijo step-size search, without the assumption that the sequence of iterates is bounded, and modified forms of the memory gradient method combining the FR, PR, and HS conjugate gradient formulas are given. Numerical experiments show that the new algorithm is more stable and more effective than the FR, PR, and HS conjugate gradient methods and the memory gradient method under Armijo line search.
An assumed condition on the parameters in the memory gradient direction was given to determine the range of values these parameters may take; within this range the direction is a sufficient descent direction of the objective function, and a new memory gradient algorithm was presented. Global convergence was discussed under the generalized Armijo step-size rule, without the assumption that the sequence of iterates is bounded. Combining the FR, PR, and HS methods with the new method, modified forms of the memory gradient algorithm were given. Numerical results showed that the new algorithm was more stable and efficient than the FR, PR, and HS conjugate gradient methods and the memory gradient method under the Armijo step-size rule.
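As an illustration only (the paper's exact parameter condition is not reproduced here), the following is a minimal sketch of a memory gradient iteration: the direction is d_k = -g_k + η_k d_{k-1}, where the hypothetical safeguard η_k = ρ‖g_k‖² / (2|g_kᵀd_{k-1}|) forces g_kᵀd_k ≤ -(1 - ρ/2)‖g_k‖², i.e. sufficient descent, and the step length is found by Armijo backtracking. All parameter values are illustrative assumptions, not the paper's.

```python
import math

def memory_gradient(f, grad, x0, rho=0.5, sigma=0.1, beta=0.5,
                    tol=1e-8, max_iter=500):
    """Sketch of a memory gradient method with Armijo backtracking.

    Direction: d_k = -g_k + eta_k * d_{k-1}, with eta_k safeguarded
    (a hypothetical choice) so that d_k is a sufficient descent
    direction.  Works on plain Python lists to stay self-contained.
    """
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                 # first step: steepest descent
    for _ in range(max_iter):
        if math.sqrt(sum(gi * gi for gi in g)) < tol:
            break
        # Armijo backtracking: f(x + t d) <= f(x) + sigma * t * g^T d
        gTd = sum(gi * di for gi, di in zip(g, d))
        fx, t = f(x), 1.0
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + sigma * t * gTd:
            t *= beta
        x = [xi + t * di for xi, di in zip(x, d)]
        g = grad(x)
        # Hypothetical safeguarded memory parameter: with this eta,
        # g^T d_k <= -(1 - rho/2) * ||g||^2, so d_k is a descent direction.
        gTd_prev = sum(gi * di for gi, di in zip(g, d))
        gnorm2 = sum(gi * gi for gi in g)
        eta = rho * gnorm2 / (2.0 * abs(gTd_prev)) if gTd_prev != 0 else 0.0
        d = [-gi + eta * di for gi, di in zip(g, d)]
    return x
```

On a simple strongly convex quadratic the iteration drives the gradient to zero; the safeguard is only one of many ways to keep the memory term from destroying descent.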
Source
《郑州大学学报(理学版)》
CAS
PKU Core Journal (北大核心)
2011, No. 3, pp. 16-18, 21 (4 pages)
Journal of Zhengzhou University:Natural Science Edition
Funding
Supported by the Natural Science Foundation of Shanxi Province
Grant No. 2008011013
Keywords
unconstrained optimization
memory gradient method
generalized Armijo line search
global convergence