Journal Articles
2 articles found
1. SIGNGD with Error Feedback Meets Lazily Aggregated Technique: Communication-Efficient Algorithms for Distributed Learning
Authors: Xiaoge Deng, Tao Sun, Feng Liu, Dongsheng Li. Tsinghua Science and Technology (SCIE, EI, CAS, CSCD), 2022, No. 1, pp. 174-185 (12 pages).
The proliferation of massive datasets has led to significant interest in distributed algorithms for solving large-scale machine learning problems. However, communication overhead is a major bottleneck that hampers the scalability of distributed machine learning systems. In this paper, we design two communication-efficient algorithms for distributed learning tasks. The first, EF-SIGNGD, uses 1-bit (sign-based) gradient quantization to save communication bits; the error feedback technique, i.e., incorporating the error made by the compression operator into the next step, is employed to guarantee convergence. The second, LE-SIGNGD, adds a well-designed lazy gradient aggregation rule to EF-SIGNGD that detects gradients with small changes and reuses the outdated information. LE-SIGNGD saves communication costs in both transmitted bits and communication rounds. Furthermore, we show that LE-SIGNGD converges under mild assumptions. The effectiveness of both proposed algorithms is demonstrated through experiments on real and synthetic data.
Keywords: distributed learning; communication-efficient algorithm; convergence analysis
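The error-feedback idea described in the abstract can be sketched in a few lines. This is a minimal single-worker illustration, not the paper's EF-SIGNGD algorithm: the scaling by the mean absolute value and the toy quadratic objective are assumptions made here for demonstration.

```python
import numpy as np

def ef_sign_compress(grad, error):
    """One error-feedback step: compress (gradient + residual) to its sign,
    scaled so one scalar plus one bit per coordinate is transmitted, and
    carry the compression error into the next round."""
    corrected = grad + error                 # add the residual from the last round
    scale = np.mean(np.abs(corrected))       # single scalar shared with the server
    compressed = scale * np.sign(corrected)  # 1-bit-per-coordinate message
    new_error = corrected - compressed       # error fed back at the next step
    return compressed, new_error

# toy objective f(x) = 0.5 * ||x||^2, whose gradient is x
x = np.array([1.0, -2.0, 0.5])
error = np.zeros_like(x)
lr = 0.1
for _ in range(200):
    grad = x
    msg, error = ef_sign_compress(grad, error)
    x = x - lr * msg
```

Without the `error` term, plain sign descent can stall or cycle; feeding the compression residual back into the next gradient is what restores convergence guarantees.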
2. A review of distributed statistical inference
Authors: Yuan Gao, Weidong Liu, Hansheng Wang, Xiaozhou Wang, Yibo Yan, Riquan Zhang. Statistical Theory and Related Fields, 2022, No. 2, pp. 89-99 (11 pages).
The rapid emergence of massive datasets in various fields poses a serious challenge to traditional statistical methods, while also providing opportunities for researchers to develop novel algorithms. Inspired by the idea of divide-and-conquer, various distributed frameworks for statistical estimation and inference have been proposed to deal with large-scale statistical optimization problems. This paper provides a comprehensive review of the related literature, covering parametric models, nonparametric models, and other frequently used models; their key ideas and theoretical properties are summarized. The trade-off between communication cost and estimation precision, together with other concerns, is also discussed.
Keywords: distributed computing; divide-and-conquer; communication efficiency; shrinkage methods; nonparametric estimation; principal component analysis; feature screening; bootstrap
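The divide-and-conquer framework the review surveys is easiest to see in its simplest form: each machine fits an estimator on its own shard and communicates only the fitted parameters, which are then averaged. The sketch below uses one-shot averaging of least-squares estimates on simulated data; the data-generating model and all parameter values are assumptions chosen here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_ols(X, y):
    """Least-squares fit on one machine's local shard."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# simulate K machines, each holding n samples from y = X @ beta + noise
K, n, p = 10, 200, 3
beta = np.array([1.0, -0.5, 2.0])
estimates = []
for _ in range(K):
    X = rng.normal(size=(n, p))
    y = X @ beta + 0.1 * rng.normal(size=n)
    estimates.append(local_ols(X, y))

# one-shot averaging: only p numbers travel from each machine to the server
beta_avg = np.mean(estimates, axis=0)
```

The communication cost is a single p-vector per machine regardless of n, which is the communication/precision trade-off the review discusses: averaging recovers near full-sample precision when local sample sizes are large enough.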