
A Primal-Dual SGD Algorithm for Distributed Nonconvex Optimization (Cited by: 3)

Abstract: The distributed nonconvex optimization problem of minimizing a global cost function, formed as the sum of n local cost functions, by using only local information exchange is considered. This problem is an important component of many machine learning techniques with data parallelism, such as deep learning and federated learning. We propose a distributed primal-dual stochastic gradient descent (SGD) algorithm, suitable for arbitrarily connected communication networks and any smooth (possibly nonconvex) cost functions. We show that the proposed algorithm achieves the linear speedup convergence rate O(1/√(nT)) for general nonconvex cost functions, and the linear speedup convergence rate O(1/(nT)) when the global cost function satisfies the Polyak-Lojasiewicz (P-L) condition, where T is the total number of iterations. We also show that the output of the proposed algorithm with constant parameters linearly converges to a neighborhood of a global optimum. We demonstrate through numerical experiments the efficiency of our algorithm in comparison with the baseline centralized SGD and recently proposed distributed SGD algorithms.
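To make the abstract's setup concrete, the sketch below simulates a generic distributed primal-dual SGD scheme on a ring network, in which each agent holds a smooth local cost and exchanges information only with its neighbors. It is a minimal illustration of this class of methods, not the authors' published update rule: the quadratic local costs, the mixing matrix W, and the step sizes eta and alpha are assumptions introduced for the example.

```python
# Minimal sketch of a generic distributed primal-dual SGD update on a ring
# network. Illustration only: local costs, mixing matrix W, and step sizes
# eta/alpha are assumed for this example and are not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

n, d, T = 8, 5, 3000          # number of agents, dimension, iterations
eta, alpha = 0.02, 0.02       # primal and dual step sizes (assumed values)

# Synthetic smooth local costs f_i(x) = 0.5 * ||A_i x - b_i||^2 (toy example)
A = rng.normal(size=(n, d, d)) / np.sqrt(d)
b = rng.normal(size=(n, d))

def local_stoch_grad(i, x):
    """Stochastic gradient of the i-th local cost (additive Gaussian noise)."""
    return A[i].T @ (A[i] @ x - b[i]) + 0.1 * rng.normal(size=d)

# Doubly stochastic mixing weights for a ring communication graph
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros((n, d))   # primal variables, one row per agent
v = np.zeros((n, d))   # dual variables penalizing disagreement between agents

for _ in range(T):
    g = np.array([local_stoch_grad(i, x[i]) for i in range(n)])
    # Primal step: average with neighbors, then descend along the noisy
    # local gradient corrected by the dual variable.
    x_next = W @ x - eta * (g + v)
    # Dual step: accumulate the consensus violation (I - W) x.
    v = v + alpha * (x - W @ x)
    x = x_next

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
print("average iterate:", x.mean(axis=0))
```

Running the script prints the consensus error across agents and the averaged iterate; with the assumed step sizes the agents reach approximate agreement on the toy quadratic problem, which is the behavior the abstract's linear-speedup guarantees formalize for general smooth costs.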
Source: IEEE/CAA Journal of Automatica Sinica (自动化学报(英文版)), SCIE/EI/CSCD, 2022, No. 5, pp. 812-833 (22 pages).
Funding: Supported by the Knut and Alice Wallenberg Foundation, the Swedish Foundation for Strategic Research, the Swedish Research Council, and the National Natural Science Foundation of China (62133003, 61991403, 61991404, 61991400).