Abstract
The gradient method is widely used for large-scale unconstrained optimization problems because of its simple iterative form and small storage requirements. Based on a modified quadratic approximation model and the modified BFGS formula, a new approximate optimal stepsize is proposed. This stepsize is truncated by the two well-known BB stepsizes so that it remains between them. Under suitable assumptions, the global convergence of the proposed method is proved. Numerical experiments show that the method outperforms some existing gradient methods.
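The abstract does not reproduce the paper's formulas, but the two Barzilai-Borwein (BB) steplengths it refers to are standard. A minimal sketch of the truncation described above, with \tilde{\alpha}_k used only as a placeholder symbol for the paper's approximate optimal stepsize, is:

```latex
% Standard Barzilai-Borwein steplengths, with
% s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}:
\[
  \alpha_k^{BB1} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
  \qquad
  \alpha_k^{BB2} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}}.
\]
% When s_{k-1}^{\top} y_{k-1} > 0, the Cauchy-Schwarz inequality gives
% \alpha_k^{BB2} \le \alpha_k^{BB1}, so truncating an approximate optimal
% stepsize \tilde{\alpha}_k (placeholder; the paper's formula is not
% given in the abstract) onto [\alpha_k^{BB2}, \alpha_k^{BB1}] reads
\[
  \alpha_k = \max\bigl\{ \alpha_k^{BB2},\,
             \min\bigl\{ \tilde{\alpha}_k,\, \alpha_k^{BB1} \bigr\} \bigr\},
  \qquad
  x_{k+1} = x_k - \alpha_k g_k .
\]
```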
Author
WANG Yu (School of Mathematical Science, Chongqing Normal University, Chongqing 401331, China)
Source
Mathematics in Practice and Theory (《数学的实践与认识》), 2023, No. 2, pp. 207-215 (9 pages)
Keywords
gradient method
quadratic approximation model
BB steplength
approximate optimal stepsize
global convergence