Abstract
The common methods for solving unconstrained optimization problems include descent algorithms, Newton's method, and the conjugate gradient method. When the objective function is the sum of several smooth functions, some authors have proposed and studied the incremental gradient algorithm, whose basic idea is to cyclically select the negative gradient of a single component function as the iteration direction. This direction is not necessarily a descent direction, so the step length cannot be determined by the one-dimensional line search used in descent algorithms; constrained by the choice of step length, the algorithm's convergence is not efficient. This paper combines the ideas of descent algorithms and the incremental gradient algorithm and proposes the split gradient method. Briefly, the split gradient method cyclically considers the negative gradient of each component function: if it is a descent direction, it is taken as the iteration direction; otherwise, the negative gradient of the whole objective function is taken instead. Finally, numerical experiments compare the split gradient method with the steepest descent algorithm, the random descent algorithm, and the incremental gradient algorithm. The results show that the split gradient method is more effective than the others for some optimization problems.
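The iteration described in the abstract can be sketched as follows. This is a minimal illustration assuming an objective f(x) = Σᵢ fᵢ(x) with known component gradients; the function names, parameter values, and the backtracking variant of the Armijo rule are assumptions for the sketch, not the authors' implementation.

```python
# Sketch of the split gradient method from the abstract (illustrative only).
# Assumes f(x) = sum_i f_i(x) with the component gradients given in grad_fs.
import numpy as np

def armijo_step(f, x, d, g, beta=0.5, sigma=1e-4, max_backtracks=50):
    """Backtracking Armijo rule: find t with f(x + t*d) <= f(x) + sigma*t*(g.d)."""
    t = 1.0
    fx = f(x)
    gd = g @ d
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * gd:
            break
        t *= beta
    return t

def split_gradient(f, grad_fs, x0, max_iter=1000, tol=1e-8):
    """Cycle over the component gradients; fall back to the full negative
    gradient whenever -grad f_i is not a descent direction for f."""
    x = np.asarray(x0, dtype=float)
    m = len(grad_fs)
    for k in range(max_iter):
        g = sum(gf(x) for gf in grad_fs)   # gradient of the full objective
        if np.linalg.norm(g) < tol:
            break
        d = -grad_fs[k % m](x)             # candidate: single-component direction
        if g @ d >= 0:                     # not a descent direction for f
            d = -g                         # fall back to steepest descent
        x = x + armijo_step(f, x, d, g) * d
    return x

# Toy example: f = f1 + f2 with f1 = (x-1)^2, f2 = (x+1)^2; minimizer x = 0.
f  = lambda x: (x[0] - 1.0) ** 2 + (x[0] + 1.0) ** 2
g1 = lambda x: np.array([2.0 * (x[0] - 1.0)])
g2 = lambda x: np.array([2.0 * (x[0] + 1.0)])
x_star = split_gradient(f, [g1, g2], [5.0])
```

The descent test `g @ d >= 0` is the standard sufficient condition: a direction d is a descent direction for f at x exactly when ∇f(x)ᵀd < 0, which guarantees the Armijo backtracking search terminates with a positive step.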
Authors
QIAN Xiaohui (钱晓慧); WANG Xiangmei (王湘美)
College of Mathematics and Statistics, Guizhou University, Guiyang 550025, China
Source
《贵州大学学报(自然科学版)》 (Journal of Guizhou University: Natural Sciences)
2019, No. 6, pp. 4-9 (6 pages)
Funding
Regional Science Fund of the National Natural Science Foundation of China (11661019)
Natural Science Foundation of Guizhou Province (20161039)
Keywords
unconstrained optimization
descent algorithm
incremental gradient method
split gradient method
Armijo step rule