Journal Articles
4 articles found
1. Accelerated Primal-Dual Projection Neurodynamic Approach With Time Scaling for Linear and Set Constrained Convex Optimization Problems
Authors: You Zhao, Xing He, Mingliang Zhou, Tingwen Huang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, Issue 6, pp. 1485-1498 (14 pages)
The Nesterov accelerated dynamical approach serves as an essential tool for addressing convex optimization problems with accelerated convergence rates. Most previous studies in this field have primarily concentrated on unconstrained smooth convex optimization problems. In this paper, on the basis of the primal-dual dynamical approach, the Nesterov accelerated dynamical approach, projection operators, and directional gradients, we present two accelerated primal-dual projection neurodynamic approaches with time scaling to address convex optimization problems with smooth and nonsmooth objective functions subject to linear and set constraints. Each approach consists of a second-order ODE (ordinary differential equation) or differential inclusion system for the primal variables and a first-order ODE for the dual variables. By satisfying specific conditions for time scaling, we demonstrate that the proposed approaches achieve a faster convergence rate, requiring only convexity of the objective function. We validate the effectiveness of the two proposed accelerated primal-dual projection neurodynamic approaches through numerical experiments.
Keywords: accelerated projection neurodynamic approach; linear and set constraints; projection operators; smooth and nonsmooth convex optimization; time scaling
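The structure this abstract describes, a second-order ODE for the primal variables coupled with a first-order ODE for the dual variables, can be illustrated with a small forward-Euler simulation. This is a schematic sketch, not the authors' exact system: the equality-constrained quadratic problem, the constant damping coefficient 3, and the step size are illustrative choices.

```python
import numpy as np

# Illustrative primal-dual dynamics for: minimize 0.5*||x||^2  s.t.  Ax = b.
# Second-order ODE for the primal variable x, first-order ODE for the dual lam.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
grad_f = lambda x: x          # gradient of 0.5*||x||^2

x = np.zeros(2)               # primal position
v = np.zeros(2)               # primal velocity (second-order dynamics)
lam = np.zeros(1)             # dual variable
dt, T = 0.01, 300.0

for _ in range(int(T / dt)):
    # x'' + 3 x' = -(grad f(x) + A^T lam)   (constant damping for simplicity)
    a = -3.0 * v - (grad_f(x) + A.T @ lam)
    x, v = x + dt * v, v + dt * a
    # lam' = Ax - b                          (first-order dual flow)
    lam = lam + dt * (A @ x - b)

print(x, lam)   # converges toward x = [0.5, 0.5], lam = [-0.5]
```

At the stationary point the velocity and acceleration vanish, so x satisfies the KKT conditions grad f(x) + A^T lam = 0 and Ax = b; here that gives x = (0.5, 0.5) and lam = -0.5.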
2. A Modified Proximal Gradient Method for a Family of Nonsmooth Convex Optimization Problems
Authors: Ying-Yi Li, Hai-Bin Zhang, Fei Li. Journal of the Operations Research Society of China (EI, CSCD), 2017, Issue 3, pp. 391-403 (13 pages)
In this paper, we propose a modified proximal gradient method for solving a class of nonsmooth convex optimization problems, which arise in many contemporary statistical and signal processing applications. The proposed method adopts a new scheme to construct the descent direction based on the proximal gradient method. It is proven that the modified proximal gradient method is Q-linearly convergent without assuming strong convexity of the objective function. Some numerical experiments are conducted to evaluate the proposed method.
Keywords: nonsmooth convex optimization; modified proximal gradient method; Q-linear convergence
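For context, the baseline the paper modifies is the standard proximal gradient (forward-backward) method. The sketch below applies it to a representative nonsmooth problem from signal processing, min 0.5*||Ax - b||^2 + mu*||x||_1; it shows the basic method only, not the paper's modified descent-direction scheme, and the data and mu are illustrative.

```python
import numpy as np

# Proximal gradient (ISTA) for: min 0.5*||Ax - b||^2 + mu*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.0])
b = A @ x_true + 0.01 * rng.standard_normal(20)
mu = 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (componentwise shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x = np.zeros(5)
for _ in range(500):
    grad = A.T @ (A @ x - b)                    # gradient of the smooth part
    x = soft_threshold(x - grad / L, mu / L)    # forward-backward step

print(np.round(x, 2))  # sparse estimate close to [1, 0, -2, 0, 0]
```

Each iteration takes a gradient step on the smooth term, then applies the proximal operator of the nonsmooth term; the l1 prox is the closed-form soft-thresholding, which is what keeps the iterates sparse.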
3. A note on a family of proximal gradient methods for quasi-static incremental problems in elastoplastic analysis
Author: Yoshihiro Kanno. Theoretical & Applied Mechanics Letters (CAS, CSCD), 2020, Issue 5, pp. 315-320 (6 pages)
Accelerated proximal gradient methods have recently been developed for solving quasi-static incremental problems of elastoplastic analysis with several different yield criteria. Numerical experiments have demonstrated that these methods can outperform conventional optimization-based approaches in computational plasticity. However, in the literature these algorithms are described individually for specific yield criteria, so there exists no guide for applying them to other yield criteria. This short paper presents a general form of algorithm design, independent of the specific form of the yield criterion, that unifies the existing proximal gradient methods. A clear interpretation is also given for each step of the presented general algorithm, so that each update rule is linked to the underlying physical laws in terms of mechanical quantities.
Keywords: elastoplastic analysis; incremental problem; nonsmooth convex optimization; first-order optimization method; proximal gradient method
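The abstract's unification idea can be shown in miniature: the proximal gradient update is written once, and the yield criterion enters only through the projection (proximal) operator. The toy problem below, min 0.5*||x - q||^2 over a convex "yield set" C, and the two example sets are hypothetical stand-ins, not the paper's mechanical formulation.

```python
import numpy as np

def proximal_gradient(q, project, steps=200, step=0.5):
    """Generic update x <- project(x - step * grad), grad = x - q.
    The yield criterion appears only through `project`."""
    x = np.zeros_like(q)
    for _ in range(steps):
        x = project(x - step * (x - q))
    return x

# Two illustrative "yield sets": a unit ball (von-Mises-like) and a box (Tresca-like).
project_ball = lambda z: z / max(1.0, np.linalg.norm(z))
project_box  = lambda z: np.clip(z, -1.0, 1.0)

q = np.array([2.0, 0.5])
print(proximal_gradient(q, project_ball))   # projection of q onto the unit ball
print(proximal_gradient(q, project_box))    # q clipped to the box: [1.0, 0.5]
```

Swapping `project_ball` for `project_box` changes the criterion without touching the iteration itself, which is exactly the design point the note argues for.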
4. Nesterov’s Smoothing and Excessive Gap Methods for an Optimization Problem in VLSI Placement
Authors: Jian-Li Chen, Yan Cui, Wen-Xing Zhu. Journal of the Operations Research Society of China (EI), 2014, Issue 4, pp. 423-443 (21 pages)
In this paper, we propose an algorithm for a nonsmooth convex optimization problem arising in very large-scale integrated circuit placement. The objective function is the sum of a large number of Half-Perimeter Wire Length (HPWL) functions and a strongly convex function. The algorithm is based on Nesterov's smoothing and excessive gap techniques. Its main advantage is that it captures the HPWL information during optimization, and every subproblem has an explicit solution. The convergence rate of the algorithm is O(1/k^2), where k is the iteration counter, which is optimal. We also present preliminary experiments on nine placement contest benchmarks. Numerical examples confirm the theoretical results.
Keywords: VLSI; global placement; nonsmooth convex optimization; smoothing technique; excessive gap technique
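HPWL is nonsmooth because it is built from max and min over a net's pin coordinates: for one axis it is max(x) - min(x). Nesterov-style smoothing replaces such a max with a smooth surrogate whose gradient is Lipschitz. The log-sum-exp surrogate below is a common illustrative choice, not necessarily the exact smoothing the paper uses; it satisfies max(z) <= smooth_max(z) <= max(z) + mu*log(n).

```python
import numpy as np

def smooth_max(z, mu):
    """Smooth approximation of max(z): mu * log(sum(exp(z / mu)))."""
    z = np.asarray(z, dtype=float)
    m = z.max()                        # shift for numerical stability
    return m + mu * np.log(np.exp((z - m) / mu).sum())

pins_x = np.array([0.0, 1.0, 3.0])     # x-coordinates of one net's pins
mu = 0.01                              # smoothing parameter

hpwl_x = pins_x.max() - pins_x.min()           # exact spread: 3.0
smooth_min = -smooth_max(-pins_x, mu)          # smooth min via smooth max
smooth_x = smooth_max(pins_x, mu) - smooth_min # smooth HPWL along x
print(hpwl_x, smooth_x)    # smooth value within 2*mu*log(3) of the exact 3.0
```

Shrinking mu tightens the approximation but raises the gradient's Lipschitz constant (roughly 1/mu), which is the trade-off the smoothing and excessive gap machinery manages.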