Journal Article

DeceFL: a principled fully decentralized federated learning framework (cited by: 1)

Abstract: Traditional machine learning relies on a centralized data pipeline for model training in various applications; however, data are inherently fragmented. This decentralized nature of databases presents a serious challenge for collaboration: sending all decentralized datasets to a central server raises serious privacy concerns. Although there has been a joint effort to tackle this critical issue by proposing privacy-preserving machine learning frameworks, such as federated learning, most state-of-the-art frameworks are still built in a centralized way, in which a central client is needed to collect and distribute model information (instead of the data itself) from every other client, leading to a high communication burden and high vulnerability when there is a failure at, or an attack on, the central client. Here we propose a principled decentralized federated learning algorithm (DeceFL), which does not require a central client and relies only on local information transmission between clients and their neighbors, representing a fully decentralized learning framework. It is further proven that every client reaches the global minimum with zero performance gap and achieves the same convergence rate O(1/T) (where T is the number of iterations in gradient descent) as centralized federated learning when the loss function is smooth and strongly convex. Finally, the proposed algorithm is applied to a number of applications to illustrate its effectiveness for both convex and nonconvex loss functions, time-invariant and time-varying topologies, and IID and non-IID datasets, demonstrating its applicability to a wide range of real-world medical and industrial applications.
Source: National Science Open, 2023, Issue 1, pp. 35-51 (17 pages)
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 92167201, 52188102, 62133003, 61991403, 61991404, and 61991400) and the Jiangsu Industrial Technology Research Institute (JITRI).
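The abstract describes a server-free scheme in which each client updates its model from local data and exchanges parameters only with its neighbors. A minimal sketch of that idea, using plain decentralized gradient descent on a ring topology with a doubly stochastic mixing matrix (a hypothetical illustration, not the authors' reference implementation; losses, topology, and stepsize are assumptions):

```python
# Sketch of fully decentralized federated learning: each client k holds a
# local quadratic loss f_k(w) = 0.5 * ||A_k w - b_k||^2 (smooth, strongly
# convex), takes a local gradient step, then averages parameters with its
# two ring neighbors -- no central server is involved.
import numpy as np

rng = np.random.default_rng(0)
K, d, T, lr = 6, 3, 2000, 0.01  # clients, dimension, rounds, stepsize

# Private local data, never shared between clients.
A = [rng.normal(size=(8, d)) for _ in range(K)]
b = [rng.normal(size=8) for _ in range(K)]

# Doubly stochastic mixing matrix for a ring: client k communicates only
# with clients k-1 and k+1.
W = np.zeros((K, K))
for k in range(K):
    W[k, k] = 0.5
    W[k, (k - 1) % K] = 0.25
    W[k, (k + 1) % K] = 0.25

w = np.zeros((K, d))  # one parameter vector per client
for _ in range(T):
    grads = np.stack([A[k].T @ (A[k] @ w[k] - b[k]) for k in range(K)])
    w = W @ (w - lr * grads)  # local gradient step, then neighbor averaging

# Minimizer of the summed global loss, for comparison.
w_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
print(np.max(np.linalg.norm(w - w_star, axis=1)))  # worst client's gap
```

Note that with a constant stepsize this plain scheme only reaches a neighborhood of the global minimizer; the zero-gap O(1/T) guarantee stated in the abstract relies on the paper's algorithm and stepsize conditions.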