
A distributed stochastic optimization algorithm with gradient-tracking and distributed heavy-ball acceleration

Abstract: Distributed optimization has developed rapidly in recent years owing to its wide applications in machine learning and signal processing. In this paper, we investigate distributed optimization for minimizing a global objective that is the sum of smooth, strongly convex local cost functions distributed over an undirected network of n nodes. In contrast to existing works, we apply a distributed heavy-ball term to improve the convergence performance of the proposed algorithm. To accelerate existing distributed stochastic first-order gradient methods, this momentum term is combined with a gradient-tracking technique. We show that the proposed algorithm achieves better acceleration than GT-SAGA without increasing the computational complexity. Extensive experiments on real-world datasets verify the effectiveness and correctness of the proposed algorithm.
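The abstract does not give the paper's update equations, so the following is a minimal sketch of the general idea it describes: a gradient-tracking recursion over a doubly stochastic mixing matrix, augmented with a heavy-ball momentum term. The function name heavy_ball_gradient_tracking, the step size alpha, and the momentum weight beta are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def heavy_ball_gradient_tracking(W, grads, x0, alpha=0.01, beta=0.5, iters=500):
    """Gradient-tracking iteration with a heavy-ball momentum term (sketch).

    W     : (n, n) doubly stochastic mixing matrix of the undirected network
    grads : list of n callables; grads[i](x) returns a stochastic gradient
            of node i's local cost at x
    x0    : (n, d) initial iterates, one row per node
    """
    n, d = x0.shape
    x_prev = x0.copy()
    x = x0.copy()
    g = np.vstack([grads[i](x[i]) for i in range(n)])  # local stochastic gradients
    y = g.copy()                                       # gradient trackers

    for _ in range(iters):
        # Consensus averaging + descent along the tracked gradient
        # + heavy-ball momentum beta * (x^k - x^{k-1}).
        x_new = W @ x - alpha * y + beta * (x - x_prev)
        g_new = np.vstack([grads[i](x_new[i]) for i in range(n)])
        # Gradient tracking: each y_i asymptotically tracks the network-average gradient.
        y = W @ y + g_new - g
        x_prev, x, g = x, x_new, g_new
    return x

# Example: 5 nodes on a ring, local quadratic costs f_i(x) = 0.5*||x - c_i||^2,
# with stochastic gradients simulated by additive noise.
rng = np.random.default_rng(0)
n, d = 5, 3
c = rng.normal(size=(n, d))
grads = [lambda x, ci=ci: (x - ci) + 0.01 * rng.normal(size=d) for ci in c]
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25
x_final = heavy_ball_gradient_tracking(W, grads, np.zeros((n, d)))
```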
Source: Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2021, Issue 11, pp. 1463-1476 (14 pages).
Funding: Supported by the Open Research Fund Program of the Data Recovery Key Laboratory of Sichuan Province, China (No. DRN2001), the National Natural Science Foundation of China (Nos. 61773321 and 61762020), the Science and Technology Top-Notch Talents Support Project of Colleges and Universities in Guizhou Province, China (No. QJHKY2016065), the Science and Technology Foundation of Guizhou Province, China (No. QKHJC20181083), and the Science and Technology Talents Fund for Excellent Young Scholars of Guizhou Province, China (No. QKHPTRC20195669).