Journal Articles: 14 articles found
SMOOTHING NEWTON ALGORITHM FOR THE CIRCULAR CONE PROGRAMMING WITH A NONMONOTONE LINE SEARCH (Cited by: 8)
1
Authors: 迟晓妮, 韦洪锦, 万仲平, 朱志斌. Acta Mathematica Scientia (SCIE, CSCD), 2017(5): 1262-1280 (19 pages).
In this paper, we present a nonmonotone smoothing Newton algorithm for solving the circular cone programming (CCP) problem, in which a linear function is minimized or maximized over the intersection of an affine space with the circular cone. Based on the relationship between the circular cone and the second-order cone (SOC), we reformulate the CCP problem as a second-order cone problem (SOCP). By extending the nonmonotone line search for unconstrained optimization to the CCP, a nonmonotone smoothing Newton method is proposed for solving the CCP. Under suitable assumptions, the proposed algorithm is shown to be globally and locally quadratically convergent. Some preliminary numerical results indicate the effectiveness of the proposed algorithm for solving the CCP.
Keywords: circular cone programming; second-order cone programming; nonmonotone line search; smoothing Newton method; local quadratic convergence
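Illustrative sketch (not from the paper): the circular cone / second-order cone relationship the abstract relies on, assuming the standard definitions L_theta = {x = (x1, x_bar) : ||x_bar|| <= x1 tan(theta)} and K^n = {x : ||x_bar|| <= x1}. Scaling the first component maps one cone onto the other, which is what permits a CCP-to-SOCP reformulation. The function names are illustrative only.

```python
import numpy as np

def in_second_order_cone(x, tol=1e-12):
    """Check x = (x1, x_bar) against the SOC: ||x_bar|| <= x1."""
    return np.linalg.norm(x[1:]) <= x[0] + tol

def in_circular_cone(x, theta, tol=1e-12):
    """Check x against the circular cone L_theta: ||x_bar|| <= x1 * tan(theta).

    Equivalently, A @ x lies in the SOC with A = diag(tan(theta), I);
    this scaling is what lets a CCP be rewritten as an SOCP.
    """
    A = np.eye(len(x))
    A[0, 0] = np.tan(theta)
    return in_second_order_cone(A @ x, tol)

x = np.array([1.0, 0.3, 0.4])             # ||x_bar|| = 0.5
print(in_circular_cone(x, np.pi / 4))     # True: 0.5 <= 1 * tan(45 deg)
print(in_circular_cone(x, np.pi / 8))     # False: 0.5 > tan(22.5 deg) ~= 0.414
```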
EQUILIBRIUM ALGORITHMS WITH NONMONOTONE LINE SEARCH TECHNIQUE FOR SOLVING THE TRAFFIC ASSIGNMENT PROBLEMS (Cited by: 1)
2
Authors: ZHAO Hui, GAO Ziyou. Journal of Systems Science & Complexity (SCIE, EI, CSCD), 2005(4): 543-555 (13 pages).
This paper presents a unified framework of the nonmonotone convex combination algorithms (such as the Frank-Wolfe algorithm) for solving the traffic assignment problems. Global convergence results are established under mild conditions. The line search procedure used in our algorithm includes the nonmonotone Armijo rule, the nonmonotone Goldstein rule and the nonmonotone Wolfe rule as special cases. So, the new algorithm can be viewed as a generalization of the regular convex combination algorithm.
Keywords: traffic assignment; convex combination algorithm; nonmonotone line search; global convergence
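Illustrative sketch (not from the paper): the nonmonotone Armijo rule named above, in the Grippo-style form that compares against the maximum of the last M function values instead of f(x_k). The quadratic test problem and the parameter defaults are assumptions for illustration.

```python
import numpy as np

def nonmonotone_armijo_step(f, grad_f, x, d, f_history, M=10,
                            sigma=1e-4, beta=0.5, alpha0=1.0):
    """Backtracking step: accept alpha once
    f(x + alpha d) <= max(last M f-values) + sigma * alpha * grad_f(x)^T d."""
    f_ref = max(f_history[-M:])          # nonmonotone reference value
    slope = grad_f(x) @ d                # d is assumed to be a descent direction
    alpha = alpha0
    while f(x + alpha * d) > f_ref + sigma * alpha * slope:
        alpha *= beta
    return alpha

# Illustrative use on f(x) = ||x||^2 with the steepest-descent direction.
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([3.0, -4.0])
history = [f(x)]
for _ in range(5):
    d = -grad(x)
    alpha = nonmonotone_armijo_step(f, grad, x, d, history)
    x = x + alpha * d
    history.append(f(x))
print(history)   # overall decrease; monotone here since f is a convex quadratic
```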
A Nonmonotone Line Search Based Algorithm for Distribution Center Location Selected
3
Authors: Zhu-cui JING, Meng-gang LI, Chuan-long WANG. Acta Mathematicae Applicatae Sinica (SCIE, CSCD), 2014(3): 699-706 (8 pages).
The minimax optimization model introduced in this paper is an important model which has received some attention over the past years. In this paper, the application of the minimax model to selecting the distribution center location is first introduced. Then a new algorithm with nonmonotone line search for solving the non-decomposable minimax optimization problem is proposed. We prove that the new algorithm is globally convergent. Numerical results show the proposed algorithm is effective.
Keywords: non-decomposable minimax; nonmonotone line search; global convergence
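Illustrative sketch (not from the paper): the minimax flavor of the location problem, i.e. choosing a center x that minimizes the worst-case distance to the demand points. The data and the use of a generic derivative-free solver are assumptions; the paper's own nonmonotone line search algorithm is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

# Demand points (illustrative data, not from the paper).
points = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [5.0, 5.0]])

def worst_case_distance(x):
    """Minimax objective: largest distance from candidate center x to any point."""
    return np.max(np.linalg.norm(points - x, axis=1))

# The objective is nonsmooth, so this sketch uses a derivative-free method;
# the paper instead applies a dedicated nonmonotone line search algorithm.
res = minimize(worst_case_distance, x0=points.mean(axis=0), method="Nelder-Mead")
print(res.x, worst_case_distance(res.x))
```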
The Global Convergence of Self-Scaling BFGS Algorithm with Nonmonotone Line Search for Unconstrained Nonconvex Optimization Problems
4
Authors: Hong Xia YIN, Dong Lei DU. Acta Mathematica Sinica, English Series (SCIE, CSCD), 2007(7): 1233-1240 (8 pages).
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, to avoid possibly large eigenvalues in the Hessian approximation matrices of the objective function. It has been proved in the literature that this method has global and superlinear convergence when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of these nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
Keywords: nonmonotone line search; self-scaling BFGS method; global convergence
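Illustrative sketch (not from the paper): one common self-scaling BFGS update in inverse (H) form, assuming the Oren-Luenberger-type factor tau = y^T s / (y^T H y) applied before the usual BFGS update. The paper's precise scaling and its nonmonotone safeguards are not reproduced.

```python
import numpy as np

def self_scaling_bfgs_update(H, s, y):
    """One self-scaling BFGS update of the inverse Hessian approximation H.

    H is first rescaled by tau = (y^T s)/(y^T H y) to damp large eigenvalues,
    then the standard BFGS inverse update is applied (assumes y^T s > 0).
    """
    ys = y @ s
    tau = ys / (y @ H @ y)
    H = tau * H
    rho = 1.0 / ys
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative single update from an identity approximation.
H = np.eye(2)
s = np.array([0.5, -0.2])       # step x_{k+1} - x_k
y = np.array([1.0, -0.3])       # gradient difference g_{k+1} - g_k
print(self_scaling_bfgs_update(H, s, y))
```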
A MIXED SUPERLINEARLY CONVERGENT ALGORITHM WITH NONMONOTONE SEARCH FOR CONSTRAINED OPTIMIZATIONS
5
Authors: Xu Yifan, Wang Wei. Applied Mathematics (A Journal of Chinese Universities) (SCIE, CSCD), 2000(2): 211-219 (9 pages).
In this paper, a new mixed algorithm for constrained optimization is presented, combining a nonmonotone line search scheme, systems of linear equations for higher-order modification, and sequential quadratic programming. Under some weaker assumptions, without the strict complementarity condition, the algorithm is globally and superlinearly convergent.
Keywords: strict complementarity condition; nonmonotone line search; constrained optimization; convergence
Convergence Analysis on a Class of Nonmonotone Conjugate Gradient Methods without Sufficient Decrease Condition (Cited by: 1)
6
Authors: DU Shou-qiang, CHEN Yuan-yuan. Chinese Quarterly Journal of Mathematics (CSCD), 2004(2): 142-145 (4 pages).
In [3] Liu et al. investigated the global convergence of conjugate gradient methods. In that paper they allowed β_k to be selected in a wider range and proved the global convergence of the corresponding algorithm without the sufficient decrease condition. This paper investigates the global convergence of a nonmonotone conjugate gradient method under the same conditions.
Keywords: nonmonotone conjugate gradient; global convergence; nonmonotone line search
A CLASS OF NONMONOTONE CONJUGATE GRADIENT METHODS FOR NONCONVEX FUNCTIONS
7
Authors: Liu Yun, Wei Zengxin. Applied Mathematics (A Journal of Chinese Universities) (SCIE, CSCD), 2002(2): 208-214 (7 pages).
This paper discusses the global convergence of a class of nonmonotone conjugate gradient methods (NM methods) for nonconvex objective functions. This class of methods includes the nonmonotone counterparts of the modified Polak-Ribière method and the modified Hestenes-Stiefel method as special cases.
Keywords: nonmonotone conjugate gradient method; nonmonotone line search; global convergence; unconstrained optimization
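Illustrative sketch (not from the paper): the conjugate gradient direction update with the nonnegative ("PR+") Polak-Ribière parameter β = max{0, g_new^T (g_new - g_old) / ||g_old||^2}. This is one standard modification and may differ from the exact variants analyzed in the paper.

```python
import numpy as np

def pr_plus_direction(g_new, g_old, d_old):
    """New CG direction with the truncated Polak-Ribiere parameter (PR+).

    beta = max(0, g_new^T (g_new - g_old) / ||g_old||^2); the max with 0 is the
    usual modification that helps global convergence for nonconvex objectives.
    """
    beta = max(0.0, g_new @ (g_new - g_old) / (g_old @ g_old))
    return -g_new + beta * d_old

g_old = np.array([1.0, 2.0])
g_new = np.array([0.4, -0.1])
d_old = -g_old
print(pr_plus_direction(g_new, g_old, d_old))
```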
An Alternating Direction Nonmonotone Approximate Newton Algorithm for Inverse Problems
8
Authors: Zhuhan Zhang, Zhensheng Yu, Xinyue Gan. Journal of Applied Mathematics and Physics, 2016(11): 2069-2078 (10 pages).
In this paper, an alternating direction nonmonotone approximate Newton algorithm (ADNAN) based on nonmonotone line search is developed for solving inverse problems. It is shown that ADNAN converges to a solution of the inverse problem, and numerical results demonstrate the effectiveness of the proposed algorithm.
Keywords: nonmonotone line search; alternating direction method; bound constraints; Newton method
A Nonmonotone Projected Gradient Method for Multiobjective Problems on Convex Sets
9
Authors: Gabriel Anibal Carrizo, Nadia Soledad Fazzio, Maria Laura Schuverdt. Journal of the Operations Research Society of China (EI, CSCD), 2024(2): 410-427 (18 pages).
In this work we consider an extension of the classical scalar-valued projected gradient method for multiobjective problems on convex sets. As in Fazzio et al. (Optim Lett 13:1365-1379, 2019), a parameter which controls the step length is considered and an updating rule based on the spectral gradient method from the scalar case is proposed. In the present paper, we consider an extension of the traditional nonmonotone approach of Grippo et al. (SIAM J Numer Anal 23:707-716, 1986), based on the maximum of some previous function values, as suggested in Mita et al. (J Glob Optim 75:539-559, 2019) for unconstrained multiobjective optimization problems. We prove that the accumulation points of sequences generated by the proposed algorithm, if they exist, are stationary points of the original problem. Numerical experiments are reported.
Keywords: multiobjective optimization; projected gradient methods; nonmonotone line search; global convergence
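Illustrative sketch (not from the paper): the scalar-case building blocks the abstract refers to, namely a projected gradient step with a spectral (Barzilai-Borwein) step length, projected here onto a box for concreteness. The multiobjective extension and the paper's specific nonmonotone rule are not reproduced; names, bounds and safeguards are assumptions.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] (a simple convex set)."""
    return np.clip(x, lo, hi)

def spectral_projected_step(x, g, s_prev, y_prev, lo, hi):
    """One projected gradient step with the BB1 spectral step length
    alpha = s^T s / s^T y (safeguarded), then projection back onto the box."""
    sy = s_prev @ y_prev
    alpha = (s_prev @ s_prev) / sy if sy > 1e-12 else 1.0
    alpha = min(max(alpha, 1e-8), 1e8)     # safeguard the step length
    return project_box(x - alpha * g, lo, hi)

# Illustrative step for f(x) = ||x - c||^2 restricted to [0, 1]^2.
c = np.array([2.0, -1.0])
x_prev, x = np.array([0.2, 0.8]), np.array([0.5, 0.5])
g_prev, g = 2 * (x_prev - c), 2 * (x - c)
x_next = spectral_projected_step(x, g, x - x_prev, g - g_prev,
                                 lo=np.zeros(2), hi=np.ones(2))
print(x_next)   # moves to (1, 0), the projection of c onto the box
```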
NEW LIMITED MEMORY SYMMETRIC RANK ONE ALGORITHM FOR LARGE-SCALE UNCONSTRAINED OPTIMIZATION
10
Authors: 刘浩, 倪勤. Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2008(3): 235-239 (5 pages).
A new limited memory symmetric rank one algorithm is proposed. It combines a modified self-scaled symmetric rank one (SSR1) update with the limited memory and nonmonotone line search techniques. In this algorithm, the descent search direction is generated by the inverse limited memory SSR1 update, thus simplifying the computation. A numerical comparison of the algorithm with the well-known limited memory BFGS algorithm is given. The comparison results indicate that the new algorithm can handle a class of large-scale unconstrained optimization problems.
Keywords: optimization; large-scale systems; symmetric rank one update; nonmonotone line search; limited memory algorithm
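Illustrative sketch (not from the paper): the symmetric rank one (SR1) update the entry builds on, in direct (B) form with the usual skipping safeguard. The limited-memory and self-scaling machinery of the paper is not reproduced, and the tolerance is an assumption.

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """Symmetric rank one update of the Hessian approximation B.

    B+ = B + (y - B s)(y - B s)^T / ((y - B s)^T s), skipped when the
    denominator is small relative to ||y - B s|| * ||s|| (standard safeguard).
    """
    v = y - B @ s
    denom = v @ s
    if abs(denom) < r * np.linalg.norm(v) * np.linalg.norm(s):
        return B                        # skip the update to avoid instability
    return B + np.outer(v, v) / denom

B = np.eye(2)
s = np.array([1.0, 0.0])                # step
y = np.array([3.0, 1.0])                # gradient difference (illustrative)
print(sr1_update(B, s, y))
```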
A NONMONOTONE CONJUGATE GRADIENT ALGORITHM FOR UNCONSTRAINED OPTIMIZATION (Cited by: 28)
11
Journal of Systems Science & Complexity (SCIE, EI, CSCD), 2002(2): 139-145 (7 pages).
Conjugate gradient methods are very important methods for unconstrained optimization, especially for large scale problems. In this paper, we propose a new conjugate gradient method, in which the technique of nonmonotone line search is used. Under mild assumptions, we prove the global convergence of the method. Some numerical results are also presented.
Keywords: unconstrained optimization; conjugate gradient; nonmonotone line search; global convergence
A New QP-free Algorithm Without a Penalty Function or a Filter for Nonlinear Semidefinite Programming (Cited by: 1)
12
Authors: Jian-ling LI, Zhen-ping YANG, Jia-qi WU, Jin-bao JIAN. Acta Mathematicae Applicatae Sinica (SCIE, CSCD), 2020(3): 714-736 (23 pages).
In this paper, we present a QP-free algorithm without a penalty function or a filter for nonlinear semidefinite programming. At each iteration, two systems of linear equations with the same coefficient matrix are solved to determine the search direction; the nonmonotone line search ensures that the objective function or the constraint violation function is sufficiently reduced. There is no feasibility restoration phase in our algorithm, which is necessary for traditional filter methods. The proposed algorithm is globally convergent under some mild conditions. Preliminary numerical results indicate that the proposed algorithm performs comparably.
Keywords: nonlinear semidefinite programming; QP-free; penalty-free; nonmonotone line search; global convergence
Cyclic Gradient Methods for Unconstrained Optimization
13
Authors: Ya Zhang, Cong Sun. Journal of the Operations Research Society of China (EI, CSCD), 2024(3): 809-828 (20 pages).
The gradient method is popular for solving large-scale problems. In this work, the cyclic gradient methods for quadratic function minimization are extended to general smooth unconstrained optimization problems. Combined with a nonmonotone line search, we prove their global convergence. Furthermore, the proposed algorithms have a sublinear convergence rate for general convex functions, and an R-linear convergence rate for strongly convex problems. Numerical experiments show that the proposed methods are effective compared with the state of the art.
Keywords: gradient method; unconstrained optimization; nonmonotonic line search; global convergence
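Illustrative sketch (not from the paper): the "cyclic" idea the abstract extends, i.e. compute a Barzilai-Borwein step length and reuse it for several consecutive gradient steps before recomputing. The nonmonotone safeguards and convergence machinery of the paper are omitted; the cycle length, initial step, and test function are assumptions.

```python
import numpy as np

def cyclic_bb_gradient(f_grad, x, n_iter=30, cycle=4):
    """Gradient descent that recomputes the BB1 step length only every `cycle`
    iterations and reuses it within the cycle (cyclic gradient idea)."""
    alpha = 1e-2                          # initial step length (assumption)
    g = f_grad(x)
    for k in range(n_iter):
        x_new = x - alpha * g
        g_new = f_grad(x_new)
        if (k + 1) % cycle == 0:          # refresh the step length once per cycle
            s, y = x_new - x, g_new - g
            if s @ y > 1e-12:
                alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

# Illustrative run on the convex quadratic f(x) = 0.5 * x^T diag(1, 4) x.
grad = lambda x: np.array([1.0, 4.0]) * x
print(cyclic_bb_gradient(grad, np.array([5.0, 2.0])))   # approaches (0, 0)
```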
A NEW ADAPTIVE SUBSPACE MINIMIZATION THREE-TERM CONJUGATE GRADIENT ALGORITHM FOR UNCONSTRAINED OPTIMIZATION
14
Authors: Keke Zhang, Hongwei Liu, Zexian Liu. Journal of Computational Mathematics (SCIE, CSCD), 2021(2): 159-177 (19 pages).
A new adaptive subspace minimization three-term conjugate gradient algorithm with nonmonotone line search is introduced and analyzed in this paper. The search directions are computed by minimizing a quadratic approximation of the objective function on special subspaces, and we also propose an adaptive rule for choosing different search directions at each iteration. We show that each choice of the search direction satisfies the sufficient descent condition. With the nonmonotone line search, we prove that the new algorithm is globally convergent for general nonlinear functions under some mild assumptions. Numerical experiments show that the proposed algorithm is promising for the given test problem set.
Keywords: conjugate gradient method; nonmonotone line search; subspace minimization; sufficient descent condition; global convergence
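Illustrative sketch (not from the paper): the subspace-minimization idea behind such search directions, i.e. minimize the quadratic model g^T d + 0.5 d^T H d over the two-dimensional subspace spanned by the negative gradient and the previous step, which reduces to a 2x2 linear system. Access to Hessian-vector products is assumed here for clarity; the paper instead uses conjugate-gradient-style approximations and an adaptive choice among several subspaces.

```python
import numpy as np

def subspace_min_direction(g, s_prev, hess_vec):
    """Minimize m(d) = g^T d + 0.5 d^T H d over d = a*(-g) + b*s_prev.

    Substituting the parametrization gives a 2x2 system in (a, b) built from
    Hessian-vector products; the solution is the subspace-optimal direction.
    """
    basis = np.stack([-g, s_prev])                  # rows span the subspace
    Hb = np.stack([hess_vec(v) for v in basis])     # H applied to each basis vector
    A = basis @ Hb.T                                # A[i, j] = b_i^T H b_j
    rhs = -basis @ g                                # -(b_i^T g)
    coef = np.linalg.solve(A, rhs)
    return coef @ basis                             # a*(-g) + b*s_prev

# Illustrative use on a convex quadratic with Hessian diag(2, 10).
H = np.diag([2.0, 10.0])
hess_vec = lambda v: H @ v
g = np.array([4.0, -2.0])
s_prev = np.array([0.5, 0.5])
d = subspace_min_direction(g, s_prev, hess_vec)
print(d, g @ d)        # g^T d < 0: a descent direction
```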