

A proximal gradient method for nonsmooth convex optimization problems
Abstract: This paper studies a proximal gradient method based on line search (L-PGM), together with its convergence analysis, for solving convex optimization problems whose objective function is the sum of a smooth loss function and a nonsmooth regularization function. Under the assumption that the gradient of the loss function is locally Lipschitz continuous, the R-linear convergence of the L-PGM method is proved. Then, for problems regularized by the sparse group Lasso function, we prove that an error bound condition holds around the optimal solution set, which yields the linear convergence rate of the L-PGM method for such problems. Finally, preliminary numerical results support our theoretical analysis.
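As a rough illustration of the algorithm family the abstract describes, the sketch below applies a proximal gradient step with backtracking line search to the simpler L1-regularized least-squares problem, where the proximal operator is soft-thresholding. This is an assumed, minimal example, not the paper's L-PGM (which targets the sparse group Lasso regularizer); all function names and parameter choices here are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_linesearch(A, b, lam, x0, beta=0.5, step0=1.0, iters=200):
    """Backtracking proximal gradient for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    A minimal sketch: the smooth part is the least-squares loss,
    the nonsmooth part is the L1 norm (prox = soft-thresholding)."""
    x = x0.copy()
    for _ in range(iters):
        r = A @ x - b
        g = A.T @ r                      # gradient of the smooth part
        f_x = 0.5 * np.dot(r, r)
        t = step0
        while True:
            z = soft_threshold(x - t * g, t * lam)   # proximal step
            rz = A @ z - b
            f_z = 0.5 * np.dot(rz, rz)
            d = z - x
            # Sufficient-decrease test: accept step when the smooth part
            # is majorized by its quadratic model with curvature 1/t.
            if f_z <= f_x + g @ d + np.dot(d, d) / (2 * t):
                break
            t *= beta                    # shrink the step and retry
        x = z
    return x
```

For example, with `A` the identity the minimizer is exactly `soft_threshold(b, lam)`, which gives a quick sanity check of the prox step and line search.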
Authors: LI Hongwu (李红武), XIE Min (谢敏), ZHANG Rong (张榕) — College of Applied Sciences, Beijing University of Technology, Beijing 100124, China; School of Mathematics and Statistics, Nanyang Normal University, Nanyang 473061, Henan, China; Hanergy Thin Film Power Group Headquarters, Beijing 100101, China
Source: Operations Research Transactions (《运筹学学报》), CSCD, Peking University Core Journal, 2021, No. 1, pp. 61-72 (12 pages)
Funding: National Natural Science Foundation of China (No. 11771003).
Keywords: nonsmooth convex optimization; proximal gradient method; locally Lipschitz continuous; error bound; linear convergence