Journal Articles
4 articles found
1. A new primal-dual interior-point algorithm for convex quadratic optimization (cited by: 9)
Authors: Wang Guoqiang, Bai Yanqin, Liu Yong, Zhang Min. Journal of Shanghai University (English Edition), CAS, 2008, Issue 3, pp. 189-196 (8 pages)
In this paper, a new primal-dual interior-point algorithm for convex quadratic optimization (CQO) based on a kernel function is presented. The proposed function has some properties that are easy to check. These properties enable us to improve the polynomial complexity bound of a large-update interior-point method (IPM) to O(√n log n log(n/ε)), which is currently the best known polynomial complexity bound for algorithms with the large-update method. Numerical tests were conducted to investigate the behavior of the algorithm for different values of the parameters p, q and θ, where p is the growth degree parameter, q is the barrier degree of the kernel function, and θ is the barrier update parameter.
Keywords: convex quadratic optimization (CQO), interior-point methods (IPMs), large-update method, polynomial complexity
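The abstract does not state the kernel function explicitly, but kernel functions in this line of work are typically built from a univariate function ψ with a polynomial growth term (degree p) and a barrier term (degree q), satisfying ψ(1) = 0, ψ'(1) = 0 and ψ''(t) > 0 for t > 0. A minimal sketch of one representative member of this family (an assumption for illustration; the paper's exact function may differ):

```python
import numpy as np

def psi(t, p=1.0, q=2.0):
    """A representative kernel function with growth degree p and barrier
    degree q (hypothetical form; the paper's exact function may differ).
    It satisfies psi(1) = 0, psi'(1) = 0, and psi''(t) > 0 for t > 0."""
    return (t ** (p + 1) - 1) / (p + 1) + (t ** (1 - q) - 1) / (q - 1)

def barrier(v, p=1.0, q=2.0):
    """Proximity/barrier function induced on a positive scaled iterate v:
    the coordinate-wise sum of psi over v."""
    v = np.asarray(v, dtype=float)
    return float(np.sum(psi(v, p, q)))

# The barrier vanishes exactly at the central-path target v = (1, ..., 1)
# and grows as any coordinate approaches 0 or infinity.
print(barrier(np.ones(5)))                       # vanishes at the target
print(barrier(np.array([0.5, 1.0, 2.0])) > 0)    # positive elsewhere
```

In large-update methods of this kind, the barrier parameter is reduced by the factor (1 − θ) at each outer iteration, and the inner iterations drive the proximity Ψ(v) back below a threshold.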
2. A Large-update Interior-point Algorithm for Convex Quadratic Semi-definite Optimization Based on a New Kernel Function (cited by: 9)
Author: Ming Wang Zhang. Acta Mathematica Sinica, English Series, SCIE CSCD, 2012, Issue 11, pp. 2313-2328 (16 pages)
In this paper, we present a large-update interior-point algorithm for convex quadratic semi-definite optimization based on a new kernel function. The proposed function is strongly convex; it is neither a self-regular function nor the usual logarithmic barrier function. The goal of this paper is to investigate such a kernel function and show that the algorithm has a favorable complexity bound in terms of the elegant analytic properties of the kernel function. The complexity bound is shown to be O(√n (log n)² log(n/ε)). This bound is better than those obtained by classical primal-dual interior-point methods based on the logarithmic barrier function and by recent kernel functions introduced by some authors. Some computational results are also provided.
Keywords: convex quadratic semi-definite optimization, kernel function, interior-point algorithm, large-update method, complexity
3. A Wide Neighborhood Interior-Point Algorithm for Convex Quadratic Semidefinite Optimization
Authors: Mohammad Pirhaji, Maryam Zangiabadi, Hossien Mansouri, Ali Nakhaei, Ali Shojaeifard. Journal of the Operations Research Society of China, EI CSCD, 2020, Issue 1, pp. 145-164 (20 pages)
In this paper, we propose an interior-point algorithm based on a wide neighborhood for convex quadratic semidefinite optimization problems. Using the Nesterov–Todd direction as the search direction, we provide a convergence analysis and obtain the polynomial complexity bound of the proposed algorithm. Although the algorithm belongs to the class of large-step interior-point algorithms, its complexity coincides with the best iteration bound for short-step interior-point algorithms. The algorithm is also implemented to demonstrate its efficiency.
Keywords: convex quadratic semidefinite optimization, feasible interior-point method, wide neighborhood, polynomial complexity
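The abstract does not define the wide neighborhood it uses, but a common choice in the semidefinite-optimization literature is the negative-infinity neighborhood, which requires every eigenvalue of the product XS to stay above a fraction τ of the duality measure μ = trace(XS)/n. A minimal membership check under that assumption (the paper's exact neighborhood may differ):

```python
import numpy as np

def in_wide_neighborhood(X, S, tau=0.25):
    """Check membership in the negative-infinity wide neighborhood
    commonly used for semidefinite optimization (an assumption; the
    paper's exact neighborhood may differ): every eigenvalue of X S
    must be at least tau * mu, where mu = trace(X S) / n."""
    n = X.shape[0]
    mu = np.trace(X @ S) / n
    # X S is similar to the symmetric matrix X^{1/2} S X^{1/2}, so its
    # eigenvalues are real and positive for X, S positive definite.
    w, V = np.linalg.eigh(X)
    Xhalf = V @ np.diag(np.sqrt(w)) @ V.T
    eigs = np.linalg.eigvalsh(Xhalf @ S @ Xhalf)
    return bool(np.min(eigs) >= tau * mu)

# On the central path X S = mu * I, so every eigenvalue equals mu and
# the iterate lies in the neighborhood for any tau <= 1.
X = np.diag([2.0, 2.0, 2.0])
S = np.diag([0.5, 0.5, 0.5])
print(in_wide_neighborhood(X, S, tau=0.25))
```

Wide neighborhoods of this type allow large steps, which is why the matching of their complexity with the short-step bound, as claimed in the abstract, is notable.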
4. TWO NOVEL GRADIENT METHODS WITH OPTIMAL STEP SIZES
Authors: Harry Oviedo, Oscar Dalmau, Rafael Herrera. Journal of Computational Mathematics, SCIE CSCD, 2021, Issue 3, pp. 375-391 (17 pages)
In this work we introduce two new Barzilai–Borwein-like step sizes for the classical gradient method for strictly convex quadratic optimization problems. The proposed step sizes employ second-order information in order to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information about the Hessian of the objective function. A convergence analysis of the proposed algorithms is provided. Some numerical experiments are performed in order to compare the efficiency and effectiveness of the proposed methods with similar methods in the literature. Experimentally, it is observed that our proposals accelerate the gradient method at nearly no extra computational cost, which makes them a good alternative for solving large-scale problems.
Keywords: gradient methods, convex quadratic optimization, Hessian spectral properties, steplength selection
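The abstract does not give the paper's two new step sizes, but the classical Barzilai–Borwein step already illustrates the underlying idea: a gradient method whose step length encodes approximate curvature (second-order) information at negligible extra cost. A minimal sketch for the strictly convex quadratic f(x) = ½xᵀAx − bᵀx, using the standard BB1 step (not the paper's proposed steps):

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Gradient method with the Barzilai-Borwein (BB1) step size for the
    strictly convex quadratic f(x) = 0.5*x^T A x - b^T x, with A symmetric
    positive definite. The BB1 step s^T s / s^T y is the reciprocal of a
    Rayleigh quotient of A, i.e. cheap approximate Hessian information."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                        # gradient of f
    alpha = 1.0 / np.linalg.norm(A, 2)   # conservative first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g      # step and gradient change (y = A s)
        alpha = (s @ s) / (s @ y)        # BB1 step size
        x, g = x_new, g_new
    return x

# Usage: the minimizer of f solves the linear system A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = bb_gradient(A, b, np.zeros(2))
print(np.allclose(A @ x_star, b, atol=1e-6))
```

Since y = As for a quadratic, sᵀs/sᵀy is an inverse Rayleigh quotient of the Hessian, which is the sense in which such step sizes carry spectral information about A.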