Journal Articles
357 articles found
A New Class of Nonlinear Conjugate Gradient Methods with Global Convergence Properties (Cited by 1)
1
Author: 陈忠, 《长江大学学报(自科版)(上旬)》, CAS, 2014, No. 3, pp. I0001-I0003 (3 pages)
Nonlinear conjugate gradient methods, owing to their simple iterations and low storage requirements, and because their search directions need not satisfy the secant condition, occupy an extremely important position in solving large-scale unconstrained optimization problems. This paper proposes a new class of conjugate gradient methods whose search direction is a descent direction of the objective function. Under the assumptions that the objective function is continuously differentiable with a Lipschitz-continuous gradient and that the line search satisfies the Wolfe conditions, the global convergence of the proposed algorithm is established.
Keywords: abstract, editorial office, editorial work, readers
A Descent Gradient Method and Its Global Convergence
2
Author: LIU Jin-kui, Chinese Quarterly Journal of Mathematics, CSCD, 2014, No. 1, pp. 142-150 (9 pages)
Y. Liu and C. Storey (1992) proposed the famous LS conjugate gradient method, which has good numerical results. However, the LS method has very weak convergence under the Wolfe-type line search. In this paper, we give a new descent gradient method based on the LS method. It can guarantee the sufficient descent property at each iteration and global convergence under the strong Wolfe line search. Finally, we also present extensive preliminary numerical experiments to show the efficiency of the proposed method by comparing with the famous PRP+ method.
Keywords: unconstrained optimization, conjugate gradient method, strong Wolfe line search, sufficient descent property, global convergence
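As an aside for the reader, the strong Wolfe line search conditions that recur throughout these entries can be sketched as follows. This is a minimal generic illustration, not code from any paper listed here; the helper name and the constants c1 = 1e-4, c2 = 0.1 are conventional choices assumed for the sketch.

```python
def satisfies_strong_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions for step length t along direction d.

    f, grad: objective and its gradient (vectors given as lists of floats).
    Sufficient decrease: f(x + t*d) <= f(x) + c1 * t * g^T d
    Curvature:           |g(x + t*d)^T d| <= c2 * |g(x)^T d|
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    x_new = [a + t * b for a, b in zip(x, d)]
    gtd = dot(grad(x), d)  # directional derivative at x (negative for descent d)
    sufficient_decrease = f(x_new) <= f(x) + c1 * t * gtd
    curvature = abs(dot(grad(x_new), d)) <= c2 * abs(gtd)
    return sufficient_decrease and curvature
```

For example, with f(x) = x^2, starting point x = 1 and direction d = -2, the exact-minimizer step t = 0.5 satisfies both conditions, while t = 1 overshoots and fails them.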
Almost Sure Convergence of Proximal Stochastic Accelerated Gradient Methods
3
Authors: Xin Xiang, Haoming Xia, Journal of Applied Mathematics and Physics, 2024, No. 4, pp. 1321-1336 (16 pages)
Proximal gradient descent and its accelerated version are effective methods for solving the sum of smooth and non-smooth problems. When the smooth function can be represented as a sum of multiple functions, the stochastic proximal gradient method performs well. However, research on its accelerated version remains limited. This paper proposes a proximal stochastic accelerated gradient (PSAG) method to address problems involving a combination of smooth and non-smooth components, where the smooth part corresponds to the average of multiple block sums. At the same time, most convergence analyses hold only in expectation. To this end, under some mild conditions, we present an almost sure convergence of unbiased gradient estimation in the non-smooth setting. Moreover, we establish that the minimum of the squared gradient mapping norm converges to zero with probability one.
Keywords: proximal stochastic accelerated method, almost sure convergence, composite optimization, non-smooth optimization, stochastic optimization, accelerated gradient method
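For orientation, a single plain (non-stochastic, non-accelerated) proximal gradient step with the L1 proximal operator, i.e. soft-thresholding, can be sketched as below. This is a textbook simplification of the composite smooth-plus-nonsmooth setting in the entry above; all function names are our own and this is not the paper's PSAG method.

```python
def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1, applied elementwise:
    prox(v_i) = sign(v_i) * max(|v_i| - tau, 0)."""
    sign = lambda a: 1.0 if a > 0 else (-1.0 if a < 0 else 0.0)
    return [sign(vi) * max(abs(vi) - tau, 0.0) for vi in v]

def proximal_gradient_step(grad_f, x, step, tau):
    """One step of proximal gradient descent on f(x) + tau*||x||_1:
    x+ = prox_{step*tau*||.||_1}(x - step * grad_f(x))."""
    forward = [xi - step * gi for xi, gi in zip(x, grad_f(x))]
    return soft_threshold(forward, step * tau)
```

With f(x) = 0.5*(x - 1)^2 (gradient x - 1), starting from x = 0 with step 1 and tau = 0.3, the forward step lands at 1 and soft-thresholding pulls it back to 0.7, the exact minimizer of the composite objective.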
A NEW CONJUGATE GRADIENT METHOD AND ITS GLOBAL CONVERGENCE PROPERTIES (Cited by 2)
4
Authors: LI Zhengfeng, CHEN Jing, DENG Naiyang (Division of Basic Sciences, China Agricultural University East Campus, Beijing 100083, China), Systems Science and Mathematical Sciences, SCIE EI CSCD, 1998, No. 1, pp. 53-60 (8 pages)
This paper presents a new conjugate gradient method for unconstrained optimization. This method reduces to the Polak-Ribière-Polyak method when line searches are exact, but their performances are different in the case of inexact line search. By a simple example, we show that the Wolfe conditions do not ensure that the present method and the Polak-Ribière-Polyak method will produce descent directions even under the assumption that the objective function is strictly convex. This result contradicts the folk axiom that the Polak-Ribière-Polyak method with the Wolfe line search should find the minimizer of a strictly convex objective function. Finally, we show that there are two ways to improve the new method so that it is globally convergent.
Keywords: conjugate gradient method, global convergence, unconstrained optimization, line searches
A Globally Convergent Polak-Ribière-Polyak Conjugate Gradient Method with Armijo-Type Line Search (Cited by 11)
5
Authors: Gaohang Yu, Lutai Guan, Zengxin Wei, Numerical Mathematics: A Journal of Chinese Universities (English Series), SCIE, 2006, No. 4, pp. 357-366 (10 pages)
In this paper, we propose a globally convergent Polak-Ribière-Polyak (PRP) conjugate gradient method for nonconvex minimization of differentiable functions by employing an Armijo-type line search which is simpler and less demanding than those defined in [4,10]. A favorite property of this method is that we can choose the initial stepsize as the one-dimensional minimizer of a quadratic model Φ(t) := f(x_k) + t g_k^T d_k + (1/2) t^2 d_k^T Q_k d_k, where Q_k is a positive definite matrix that carries some second-order information of the objective function f. Thus, this line search may make the stepsize t_k more easily accepted. Preliminary numerical results show that this method is efficient.
Keywords: unconstrained optimization, conjugate gradient method, global convergence, differentiable function
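The quadratic model Φ(t) in the abstract above has the closed-form minimizer t = -g_k^T d_k / (d_k^T Q_k d_k) whenever the curvature term d_k^T Q_k d_k is positive. A small sketch of that initial-stepsize rule follows; this is generic illustration code, not the authors' implementation, and the fallback stepsize for the non-positive-curvature case is our own assumption.

```python
def quadratic_model_stepsize(g, d, Q, fallback=1.0):
    """Minimizer of Phi(t) = f + t*g^T d + 0.5*t^2*d^T Q d over t.

    g, d: gradient and search direction as lists of floats;
    Q: matrix as a list of rows.
    Returns -g^T d / (d^T Q d) when d^T Q d > 0, else the fallback stepsize.
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    Qd = [dot(row, d) for row in Q]
    curvature = dot(d, Qd)          # d^T Q d
    if curvature <= 0.0:
        return fallback             # model has no interior minimizer
    return -dot(g, d) / curvature
```

For f(x) = x^2 at x = 1 (g = 2, Hessian Q = 2) with steepest-descent direction d = -2, the rule returns t = 0.5, which steps exactly to the minimizer.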
A New Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems (Cited by 1)
6
Authors: LIU Jin-kui, WANG Kai-rong, SONG Xiao-qian, DU Xiang-lin, Chinese Quarterly Journal of Mathematics, CSCD, 2010, No. 3, pp. 444-450 (7 pages)
In this paper, an efficient conjugate gradient method is given for solving general unconstrained optimization problems. It can guarantee the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable by comparison with the PRP+ method, and so it can be widely used in scientific computation.
Keywords: unconstrained optimization, conjugate gradient method, strong Wolfe line search, sufficient descent property, global convergence
A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization
7
Authors: Hao Fan, Zhibin Zhu, Anwa Zhou, Applied Mathematics, 2011, No. 9, pp. 1119-1123 (5 pages)
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line searches. We use a steplength technique which ensures that the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve it and give further analysis.
Keywords: large-scale unconstrained optimization, conjugate gradient method, sufficient descent property, globally convergent
A hybrid conjugate gradient method for optimization problems
8
Authors: Xiangrong Li, Xupei Zhao, Natural Science, 2011, No. 1, pp. 85-90 (6 pages)
A hybrid method of the Polak-Ribière-Polyak (PRP) method and the Wei-Yao-Liu (WYL) method is proposed for unconstrained optimization problems, which possesses the following properties: i) this method inherits an important property of the well-known PRP method, the tendency to turn towards the steepest descent direction if a small step is generated away from the solution, preventing a sequence of tiny steps from happening; ii) the scalar holds automatically; iii) global convergence with some line search rule is established for nonconvex functions. Numerical results show that the method is effective for the test problems.
Keywords: line search, unconstrained optimization, conjugate gradient method, global convergence
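Several entries above build on the PRP conjugate parameter. For reference, β_PRP and its nonnegative PRP+ truncation can be sketched as follows; this is a generic illustration with function names of our own, not the hybrid PRP-WYL rule of the paper above.

```python
def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak parameter:
    beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    y = [a - b for a, b in zip(g_new, g_old)]   # gradient difference
    return dot(g_new, y) / dot(g_old, g_old)

def beta_prp_plus(g_new, g_old):
    """PRP+ truncation: max(beta_PRP, 0), used to restore global convergence."""
    return max(beta_prp(g_new, g_old), 0.0)

def next_direction(g_new, d_old, beta):
    """Conjugate gradient update d_{k+1} = -g_{k+1} + beta * d_k."""
    return [-a + beta * b for a, b in zip(g_new, d_old)]
```

A small numerical check: for g_k = (2, 0) and g_{k+1} = (0, 1), β_PRP = 1/4, and the new direction from d_k = (-2, 0) is (-0.5, -1).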
A TRUST REGION ALGORITHM FOR CONSTRAINED NONSMOOTH OPTIMIZATION PROBLEMS (Cited by 2)
9
Authors: Yu-fei Yang, Dong-hui Li, Journal of Computational Mathematics, SCIE EI CSCD, 2001, No. 4, pp. 357-364 (8 pages)
Presents an inexact trust region algorithm for solving constrained nonsmooth optimization problems: global convergence of the algorithm; assumptions on the algorithm; relation between critical points and stationary points.
Keywords: trust region method, nonsmooth function, constrained optimization, global convergence
A modified three-term conjugate gradient method with sufficient descent property (Cited by 1)
10
Author: Saman Babaie-Kafaki, Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2015, No. 3, pp. 263-272 (10 pages)
A hybridization of the three-term conjugate gradient method proposed by Zhang et al. and the nonlinear conjugate gradient method proposed by Polak, Ribière and Polyak is suggested. Based on an eigenvalue analysis, it is shown that search directions of the proposed method satisfy the sufficient descent condition, independent of the line search and of the objective function's convexity. Global convergence of the method is established under an Armijo-type line search condition. Numerical experiments show the practical efficiency of the proposed method.
Keywords: unconstrained optimization, conjugate gradient method, eigenvalue, sufficient descent condition, global convergence
A SQP METHOD FOR MINIMIZING A CLASS OF NONSMOOTH FUNCTIONS
11
Authors: 孙小玲, 张连生, 白延琴, Numerical Mathematics: A Journal of Chinese Universities (English Series), SCIE, 1996, No. 2, pp. 139-146 (8 pages)
In this paper, we present a successive quadratic programming (SQP) method for minimizing a class of nonsmooth functions, which are the sum of a convex function and a nonsmooth composite function. The method generates new iterates by using the Armijo-type line search technique after the search directions have been found. The global convergence property is established under mild assumptions. Numerical results are also offered.
Keywords: nonsmooth optimization, SQP method, global convergence
Modified LS Method for Unconstrained Optimization
12
Authors: Jinkui Liu, Li Zheng, Applied Mathematics, 2011, No. 6, pp. 779-782 (4 pages)
In this paper, a new conjugate gradient formula and its algorithm for solving unconstrained optimization problems are proposed. The given formula satisfies the descent condition. Under the Grippo-Lucidi line search, the global convergence property of the given method is discussed. The numerical results show that the new method is efficient for the given test problems.
Keywords: unconstrained optimization, conjugate gradient method, Grippo-Lucidi line search, global convergence
A CLASS OF NONMONOTONE CONJUGATE GRADIENT METHODS FOR NONCONVEX FUNCTIONS
13
Authors: Liu Yun, Wei Zengxin, Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2002, No. 2, pp. 208-214 (7 pages)
This paper discusses the global convergence of a class of nonmonotone conjugate gradient methods (NM methods) for nonconvex objective functions. This class of methods includes the nonmonotone counterparts of the modified Polak-Ribière method and the modified Hestenes-Stiefel method as special cases.
Keywords: nonmonotone conjugate gradient method, nonmonotone line search, global convergence, unconstrained optimization
An Adaptive Two-Parameter Conjugate Gradient Method Approximating BFGS
14
Authors: 李向利, 莫元健, 梅建平, 《应用数学》, PKU Core, 2024, No. 1, pp. 89-99 (11 pages)
To solve large-scale unconstrained optimization problems more effectively, this paper proposes an adaptive two-parameter conjugate gradient method based on the self-scaling memoryless BFGS quasi-Newton method. The designed search direction satisfies sufficient descent. Under general assumptions and the standard Wolfe line search criterion, the method is proved to be globally convergent, and numerical results show that the proposed new algorithm is effective.
Keywords: large-scale unconstrained optimization, conjugate gradient method, Wolfe line search, global convergence
Global Convergence of a WYL-Type Spectral Conjugate Gradient Method
15
Authors: 蔡宇, 周光辉, 《数学物理学报(A辑)》, CSCD, PKU Core, 2024, No. 1, pp. 173-184 (12 pages)
To solve large-scale unconstrained optimization problems, this paper combines the WYL conjugate gradient method with the spectral conjugate gradient method to give a WYL-type spectral conjugate gradient method. Independent of any line search, the search directions generated by the method all satisfy sufficient descent, and global convergence is proved under the strong Wolfe line search. Compared with the convergence of the WYL conjugate gradient method, the WYL-type spectral conjugate gradient method enlarges the admissible range of the line search parameter σ. Finally, the corresponding numerical results show that the method is effective.
Keywords: unconstrained optimization, spectral conjugate gradient method, strong Wolfe line search, global convergence
A Hager-Zhang Conjugate Gradient Algorithm with Relaxed Conditions
16
Authors: 赵倩倩, 申远, 《许昌学院学报》, CAS, 2024, No. 2, pp. 17-21 (5 pages)
The conjugate gradient (CG) algorithm is a classical algorithm for solving unconstrained quadratic optimization problems, but it cannot solve non-quadratic problems. To address this, a relaxed-condition descent CG algorithm is designed by introducing a new parameter into the Hager-Zhang conjugate gradient descent algorithm. The algorithm stores no Jacobian matrix at any iteration and can therefore solve large-scale nonsmooth problems. The results show that the algorithm not only satisfies global convergence with excellent numerical performance, but can also solve constrained monotone equations; it thus has stronger adaptability than other CG algorithms.
Keywords: unconstrained optimization, conjugate gradient method, global convergence, monotone equations
An Improved HS Conjugate Gradient Method under the Standard Wolfe Line Search
17
Authors: 王森森, 郑宗剑, 韩信, 《四川文理学院学报》, 2024, No. 2, pp. 50-55 (6 pages)
By modifying the existing HS conjugate gradient method, an improved HS conjugate gradient method with the descent property is proposed, and its descent property is verified. Under the standard Wolfe line search conditions, the improved HS algorithm is proved to be globally convergent. Finally, comparisons of numerical experiment results show that the new algorithm performs excellently.
Keywords: unconstrained optimization, conjugate gradient method, standard Wolfe line search, global convergence
A Hybrid Spectral Conjugate Gradient Method Based on the Strong Wolfe Line Search
18
Authors: 古恒洋, 胡鹏, 《东莞理工学院学报》, 2024, No. 3, pp. 38-42 (5 pages)
The conjugate gradient method has the advantages of small storage and fast computation. Based on PRP-type conjugate parameters, a new hybrid conjugate parameter formed as a convex combination is designed. The parameter not only enjoys the good convergence properties of the FR method but also retains the good numerical behavior of PRP-type methods. On top of the new parameter, a corresponding spectral conjugate parameter is designed, and the algorithm is proved to be globally convergent under the strong Wolfe line search conditions. Finally, numerical experiments on problems from the CUTEr test set show that the algorithm has good numerical performance.
Keywords: unconstrained optimization, hybrid conjugate parameter, spectral conjugate gradient method, global convergence
A Cyclic BB Gradient Method with Delayed Stepsizes
19
Author: 杨奕涵, 《东莞理工学院学报》, 2024, No. 1, pp. 1-6 (6 pages)
Gradient methods are commonly used for solving large-scale unconstrained optimization problems. By extending the stepsize for minimizing quadratic functions to general unconstrained optimization, and using the ideas of a one-step delay and of cyclic gradient methods, a cyclic Barzilai-Borwein gradient method (cyclic BB gradient method) is proposed. Combined with the Zhang-Hager nonmonotone line search technique, a cyclic BB gradient algorithm for general unconstrained optimization, the CBBGM algorithm, is given. Under suitable assumptions, the CBBGM algorithm is globally convergent, and when the objective function is strongly convex, the algorithm converges linearly. Numerical experiments show that the proposed method is computationally more efficient than existing methods.
Keywords: Barzilai-Borwein gradient method, unconstrained optimization, Zhang-Hager nonmonotone line search, global convergence
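The Barzilai-Borwein stepsizes underlying the entry above are computed from the differences s_k = x_k - x_{k-1} and y_k = g_k - g_{k-1}. A minimal generic sketch follows; it is not the paper's cyclic or delayed variant, and the function name is our own.

```python
def bb_stepsizes(s, y):
    """Barzilai-Borwein stepsizes from s = x_k - x_{k-1}, y = g_k - g_{k-1}:
    BB1 (long)  = s^T s / s^T y
    BB2 (short) = s^T y / y^T y
    Both approximate the inverse curvature along the last step."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    sty = dot(s, y)
    return dot(s, s) / sty, sty / dot(y, y)
```

For the quadratic f(x) = x^2 (gradient 2x), a move s = 1 gives y = 2, and both formulas recover the exact inverse-curvature step 1/2.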
Global Convergence of a Modified Spectral CD Conjugate Gradient Method (Cited by 7)
20
Authors: Wei CAO, Kai Rong WANG, Yi Li WANG, Journal of Mathematical Research and Exposition, CSCD, 2011, No. 2, pp. 261-268 (8 pages)
In this paper, we present a new nonlinear modified spectral CD conjugate gradient method for solving large-scale unconstrained optimization problems. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method if the line search is exact. Under some mild conditions, we prove that the modified method with line search is globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.
Keywords: unconstrained optimization, conjugate gradient method, Armijo-type line search, global convergence