Journal Articles
8 articles found
1. Global Convergence of the Polak-Ribière and Hestenes-Stiefel Conjugate Gradient Methods for Unconstrained Optimization (in English) (Cited: 10)
Authors: 王长钰, 韩继业, 王磊. 《运筹学学报》 (OR Transactions), CSCD, 2000, No. 3, pp. 1-7.
Under very weak conditions, this paper obtains new global convergence results for the Polak-Ribière and Hestenes-Stiefel conjugate gradient methods for unconstrained optimization, in which the parameters β_k^PR and β_k^HS of the PR and HS methods may take values in a certain negative region that depends on k. These new convergence results improve upon existing results in the literature. Numerical tests show that the new PR and HS methods are quite effective.
Keywords: unconstrained optimization, global convergence, PR conjugate gradient method, HS conjugate gradient method
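For context, the PR and HS parameters mentioned in this abstract have standard textbook forms; the paper's contribution concerns allowing them to take negative values. A minimal Python sketch of the standard formulas only, not the paper's modified methods:

```python
import numpy as np

def beta_pr(g_new, g_old):
    """Polak-Ribiere parameter: g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2."""
    y = g_new - g_old
    return float(g_new @ y) / float(g_old @ g_old)

def beta_hs(g_new, g_old, d_old):
    """Hestenes-Stiefel parameter: g_{k+1}^T y_k / (d_k^T y_k)."""
    y = g_new - g_old
    return float(g_new @ y) / float(d_old @ y)

# Illustrative gradients at two successive iterates (made-up data):
g_old = np.array([2.0, 4.0])
g_new = np.array([0.4, -1.6])
d_old = -g_old  # first search direction is the steepest descent direction
print(beta_pr(g_new, g_old))
print(beta_hs(g_new, g_old, d_old))
```

Both quotients can turn negative when successive gradients are nearly opposed, which is the regime the paper's convergence analysis addresses.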
2. A Parallel Algorithm for Unconstrained Optimization (Cited: 1)
Author: 郑瑾环. 《云南师范大学学报(自然科学版)》 (Journal of Yunnan Normal University, Natural Sciences Edition), 2002, No. 5, pp. 1-4.
This paper proposes a difference method for numerical optimization problems whose computational performance is slightly better than that of the BFGS formula among quasi-Newton methods.
Keywords: unconstrained optimization, parallel algorithm, difference method, parallelism, convergence, computational performance
3. LIMITED MEMORY BFGS METHOD BY USING LINEAR INDEPENDENT SEARCH DIRECTIONS
Author: 倪勤. 《Transactions of Nanjing University of Aeronautics and Astronautics》, EI, 2001, No. 2, pp. 236-239.
The degree of numerical linear independence is proposed and discussed. Based on this linear independence theory, a modified limited memory BFGS method is developed. Similar to the standard limited memory method, this new method determines the new update by applying the updating formula m times to an initial positive diagonal matrix, using the m previous pairs of the change in iterate and gradient. Besides the most recent pair of the change, which guarantees quadratic termination, the choice of the other (m-1) pairs in the new method depends on the degree of numerical linear independence of previous search directions. In addition, the numerical linear independence theory is further discussed and the computation of the degree of linear independence is simplified. Theoretical and numerical results show that this new modified method efficiently improves on the standard limited memory method.
Keywords: unconstrained optimization, limited memory method, BFGS method, degree of linear independence
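The limited memory update described in this abstract is conventionally applied via the standard two-loop recursion. A minimal Python sketch of that standard recursion only; the paper's pair-selection rule based on the degree of linear independence is not reproduced here:

```python
import numpy as np

def two_loop(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns H*grad, where H is the
    implicit inverse-Hessian approximation built from the stored (s, y)
    pairs applied to an initial scaling gamma * I."""
    q = grad.copy()
    rhos = [1.0 / float(y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * float(s @ q)
        alphas.append(a)
        q -= a * y
    # Initial approximation gamma * I, gamma = s^T y / y^T y (most recent pair).
    s, y = s_list[-1], y_list[-1]
    gamma = float(s @ y) / float(y @ y)
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * float(y @ r)
        r += (a - b) * s
    return r

# Example with exact curvature pairs of f(x) = 0.5 x^T diag(1, 2) x:
s_list = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
y_list = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
g = np.array([3.0, 8.0])
print(two_loop(g, s_list, y_list))  # recovers A^{-1} g = [3, 4]
```

With exact pairs spanning the space, the recursion reproduces the Newton step, which is why the choice of which m pairs to keep (the paper's topic) matters.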
4. An Improvement of the Line-Search-Free BFGS Formula
Author: 郑瑾环. 《云南师范大学学报(自然科学版)》 (Journal of Yunnan Normal University, Natural Sciences Edition), 2002, No. 6, pp. 1-4.
The BFGS formula in quasi-Newton methods is a very effective method in nonlinear numerical optimization. This paper proposes a modification of the BFGS formula. When the modified BFGS formula iterates without a line search, the conjugacy of its iteration directions is noticeably improved. Numerical examples show that the proposed modification improves the efficiency of the BFGS formula.
Keywords: line-search-free BFGS formula, nonlinear numerical optimization, unconstrained optimization, quasi-Newton method, iteration direction
5. A New Huang Class and Its Properties for Unconstrained Optimization Problems (Cited: 1)
Authors: 韦增欣, 李桥兴. 《Journal of Mathematical Research and Exposition》, CSCD, PKU Core, 2005, No. 1, pp. 64-71.
This paper presents a new class of quasi-Newton methods for solving unconstrained minimization problems. The methods can be regarded as a generalization of the Huang class of quasi-Newton methods. We prove that the directions and iterates generated by the methods of the new class depend only on the parameter p if exact line searches are performed at each step.
Keywords: unconstrained optimization, quasi-Newton equation, quasi-Newton method
6. A NONMONOTONE CONJUGATE GRADIENT ALGORITHM FOR UNCONSTRAINED OPTIMIZATION (Cited: 28)
《Journal of Systems Science & Complexity》, SCIE, EI, CSCD, 2002, No. 2, pp. 139-145.
Conjugate gradient methods are very important for unconstrained optimization, especially for large-scale problems. In this paper, we propose a new conjugate gradient method in which the technique of nonmonotone line search is used. Under mild assumptions, we prove the global convergence of the method. Some numerical results are also presented.
Keywords: unconstrained optimization, conjugate gradient, nonmonotone line search, global convergence
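A nonmonotone line search in the Grippo-Lampariello-Lucidi style accepts a step against the maximum of the last few function values rather than only the most recent one, so occasional increases in f are tolerated. A minimal Python sketch; the parameter names M, sigma, and shrink are illustrative defaults, not values taken from this paper:

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, history, M=5, sigma=1e-4, shrink=0.5):
    """Backtracking that accepts step length t once
    f(x + t*d) <= max(last M values of f) + sigma * t * g^T d,
    the nonmonotone relaxation of the Armijo condition."""
    f_ref = max(history[-M:])   # reference value: worst of recent iterates
    gd = float(g @ d)           # directional derivative; negative for descent d
    t = 1.0
    while f(x + t * d) > f_ref + sigma * t * gd:
        t *= shrink
    return t

# Usage: one steepest-descent step on f(x) = ||x||^2 from x = (1, 1).
f = lambda x: float(x @ x)
x = np.array([1.0, 1.0])
g = 2.0 * x
d = -g
t = nonmonotone_armijo(f, x, d, g, history=[f(x)])
print(t, x + t * d)  # accepted step lands at the minimizer (0, 0)
```

With a one-element history this reduces to the ordinary Armijo backtracking; the nonmonotone behavior appears once `history` holds several past values.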
7. ON THE CONVERGENCE OF A NEW HYBRID PROJECTION ALGORITHM
Authors: Qian LIU, Changyu WANG, Xinmin YANG. 《Journal of Systems Science & Complexity》, SCIE, EI, CSCD, 2006, No. 3, pp. 423-430.
For unconstrained optimization, a new hybrid projection algorithm is presented in this paper. The algorithm has some attractive convergence properties: convergence theory can be obtained under the condition that ∇f(x) is uniformly continuous, and if f(x) is continuously differentiable and pseudo-convex, the whole sequence of iterates converges to a solution of the problem without any other assumptions. Furthermore, under appropriate conditions, the sequence of iterates has a cluster point if and only if Ω* ≠ ∅. Numerical examples are given at the end of the paper.
Keywords: global convergence, hybrid projection, unconstrained optimization
8. Derivation and Global Convergence for Memoryless Non-quasi-Newton Method
Authors: JIAO Bao Cong, YU Jing Jing, CHEN Lan Ping. 《Journal of Mathematical Research and Exposition》, CSCD, 2009, No. 3, pp. 423-433.
In this paper, a new class of memoryless non-quasi-Newton methods for solving unconstrained optimization problems is proposed, and the global convergence of this method with an inexact line search is proved. Furthermore, we propose a hybrid method that mixes the memoryless non-quasi-Newton method with the memoryless Perry-Shanno quasi-Newton method. The global convergence of this hybrid memoryless method is proved under mild assumptions. Initial results show that these new methods are efficient on the given test problems. In particular, the memoryless non-quasi-Newton method requires little storage and computation, so it can efficiently solve large-scale optimization problems.
Keywords: memoryless non-quasi-Newton method, Wolfe line search, global convergence