A distributed stochastic optimization algorithm with gradient-tracking and distributed heavy-ball acceleration
Authors: Bihao SUN, Jinhui HU, Dawen XIA, Huaqing LI (+1 more author). 《Frontiers of Information Technology & Electronic Engineering》 (SCIE, EI, CSCD), 2021, Issue 11, pp. 1463-1476 (14 pages).
Distributed optimization has been well developed in recent years due to its wide applications in machine learning and signal processing. In this paper, we focus on investigating distributed optimization to minimize a global objective. The objective is a sum of smooth and strongly convex local cost functions which are distributed over an undirected network of n nodes. In contrast to existing works, we apply a distributed heavy-ball term to improve the convergence performance of the proposed algorithm. To accelerate the convergence of existing distributed stochastic first-order gradient methods, a momentum term is combined with a gradient-tracking technique. It is shown that the proposed algorithm has better acceleration ability than GT-SAGA without increasing the complexity. Extensive experiments on real-world datasets verify the effectiveness and correctness of the proposed algorithm.
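The combination of gradient tracking and a heavy-ball momentum term described in the abstract can be sketched as follows. This is a minimal illustrative implementation on a toy problem (scalar quadratic local costs on a hypothetical 5-node ring, with assumed step size and momentum parameters), not the paper's exact stochastic GT-SAGA-style algorithm:

```python
import numpy as np

# Toy setup: n nodes on a ring, each holding a scalar strongly convex
# quadratic local cost f_i(x) = 0.5 * a_i * (x - b_i)^2.
n = 5
rng = np.random.default_rng(0)
a = rng.uniform(1.0, 2.0, n)    # local curvatures (hypothetical)
b = rng.uniform(-1.0, 1.0, n)   # local minimizers (hypothetical)
grad = lambda x: a * (x - b)    # element-wise local gradients

# Doubly stochastic mixing matrix for the ring topology.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

alpha, beta = 0.05, 0.4   # step size and heavy-ball coefficient (assumed)
x = np.zeros(n)           # local estimates, one per node
x_prev = x.copy()
y = grad(x)               # gradient trackers, initialized to local gradients

for _ in range(2000):
    # Consensus step + tracked-gradient descent + heavy-ball momentum term.
    x_new = W @ x - alpha * y + beta * (x - x_prev)
    # Gradient-tracking update: y asymptotically tracks the average gradient.
    y = W @ y + grad(x_new) - grad(x)
    x_prev, x = x, x_new

# Global minimizer of sum_i f_i for comparison.
x_star = np.sum(a * b) / np.sum(a)
print(np.max(np.abs(x - x_star)))
```

In this sketch each node mixes its estimate with its neighbors' (the `W @ x` consensus step), descends along its tracker `y` rather than its raw local gradient, and adds the heavy-ball term `beta * (x - x_prev)` that the abstract credits for the acceleration over plain gradient tracking.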
Keywords: Distributed optimization; High-performance algorithm; Multi-agent system; Machine-learning problem; Stochastic gradient