Abstract
In this paper, we consider a class of constrained nonsmooth convex optimization problems with wide applications in signal processing, image recovery, machine learning, and other areas of science and engineering. In recent years, both theoretical analysis and numerical experiments have shown that extrapolation can effectively improve the convergence rate of an algorithm, which gives proximal gradient algorithms with extrapolation a significant advantage in solving large-scale optimization problems. Motivated by smoothing techniques and the fast iterative shrinkage-thresholding algorithm (FISTA) proposed by Beck and Teboulle, we propose a new accelerated algorithm for this class of constrained nonsmooth convex optimization problems, and prove that every accumulation point of the iterates is an optimal solution of the problem. In the analysis of the algorithm, we consider several update criteria for the smoothing parameter and establish a global convergence rate of $O(\ln k/k)$ on the objective function values. Moreover, we show that the successive iterates satisfy $\lim_{k\to+\infty}\|x^{k+1}-x^k\|=0$. Finally, numerical experiments illustrate the good performance of the proposed algorithm on two classes of sparse optimization problems and the positive effect of extrapolation on the convergence rate.
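To make the scheme concrete, a generic smoothing method with FISTA-type extrapolation can be written as below. This is a minimal sketch based on standard constructions; the symbols $\tilde f_{\mu}$, $P_X$, $\alpha_k$, and $\beta_k$ are illustrative notation and not the paper's exact update rule:
\[
y^k = x^k + \beta_k\,(x^k - x^{k-1}), \qquad
x^{k+1} = P_X\!\big(y^k - \alpha_k \nabla \tilde f_{\mu_k}(y^k)\big),
\]
where $\tilde f_{\mu}$ is a smooth approximation of the nonsmooth objective with smoothing parameter $\mu>0$, $P_X$ denotes the projection onto the constraint set $X$, $\alpha_k>0$ is a step size, $\beta_k\in[0,1)$ is the extrapolation weight, and $\mu_k\downarrow 0$ is driven by one of the update criteria analyzed in the paper.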
Source
《中国科学:数学》
CSCD
PKU Core (北大核心)
2020, No. 12, pp. 1651-1666 (16 pages)
Scientia Sinica Mathematica
Funding
Supported by the National Natural Science Foundation of China (Grant Nos. 11871178 and 61773136).
Keywords
nonsmooth convex optimization problem
accelerated algorithm
smoothing technique
convergence rate