

Global Convergence of a Class of Memory Gradient Method with the Wolfe Line Search
Abstract: In this paper, we first propose a memory gradient algorithm and discuss its descent property and global convergence under the Wolfe line search. We then extend the algorithm to a more general form. Finally, we test the numerical performance of this class of memory gradient methods and compare it with the PRP, FR, HS, LS, DY and CD conjugate gradient methods; the results indicate that the class of algorithms proposed in this paper is effective.
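For context on the class of methods the abstract describes: a memory gradient method iterates x_{k+1} = x_k + α_k d_k with search direction d_k = -g_k + β_k d_{k-1}, where the step size α_k satisfies the Wolfe conditions. The paper's specific choice of β_k is not reproduced in this abstract, so the sketch below uses an illustrative bounded choice β_k = ρ‖g_k‖/‖d_{k-1}‖ with ρ ∈ (0, 1); the function names and all parameter values are hypothetical, not taken from the paper.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha0=1.0, max_iter=50):
    """Find a step size satisfying the (weak) Wolfe conditions by
    bracketing: expand while the curvature condition fails with no upper
    bound, bisect otherwise. 0 < c1 < c2 < 1."""
    lo, hi = 0.0, np.inf
    alpha = alpha0
    fx, gxd = f(x), grad(x) @ d          # gxd < 0 for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gxd:
            hi = alpha                    # sufficient decrease fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gxd:
            lo = alpha                    # curvature fails: grow or bisect
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha                          # fallback if bracketing stalls

def memory_gradient(f, grad, x0, rho=0.5, tol=1e-8, max_iter=500):
    """Memory gradient iteration d_k = -g_k + beta_k * d_{k-1} with an
    illustrative beta_k = rho * ||g_k|| / ||d_{k-1}||, rho in (0, 1)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = rho * np.linalg.norm(g_new) / np.linalg.norm(d)
        d = -g_new + beta * d             # remembered previous direction
        g = g_new
    return x
```

This choice of β_k keeps every d_k a descent direction, since by Cauchy-Schwarz g_kᵀd_k ≤ -‖g_k‖² + β_k‖g_k‖‖d_{k-1}‖ = -(1-ρ)‖g_k‖² < 0, which is the kind of descent property the abstract refers to.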
Authors: CHEN Cuiling; HAN Caihong; LUO Liling; CHEN Yu (College of Mathematics and Statistics, Guangxi Normal University, Guilin 541004, China; School of Computing and Information, University of Pittsburgh, Pittsburgh 15238, USA)
Source: Mathematica Applicata (《应用数学》), CSCD, Peking University Core Journal, 2018, No. 4, pp. 884-889 (6 pages)
Funding: Supported by the National Natural Science Foundation of China (11761014); Guangxi Natural Science Foundation (2017GXNSFAA198243); Guangxi Basic Ability Improvement Project for Young and Middle-aged Teachers of Colleges and Universities (2017KY0068, KY2016YB069); Guangxi Higher Education Undergraduate Course Teaching Reform Project (2017JGB147)
Keywords: unconstrained optimization; memory gradient method; Wolfe line search; global convergence