Abstract
This paper discusses federated learning, a distributed artificial intelligence framework that can effectively resolve the dilemmas facing the development of big data. Starting from the energy consumption metrics of federated learning, the paper explains the relevant principles in detail and analyzes the main challenges federated learning faces in terms of energy consumption. It then classifies and summarizes the basic methods for reducing energy consumption, providing useful guidance for future research.
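Since the abstract centers on the two energy terms that also appear in the keywords (local computation and wireless communication), the following minimal sketch illustrates the per-round, per-device energy model commonly used in federated-learning energy studies. The function names and numerical values are illustrative assumptions for exposition, not the formulation used in the paper itself.

```python
import math

# Sketch of a per-round device energy budget in federated learning:
# total energy = local-training (CPU) energy + wireless upload energy.
# All symbols and defaults below are hypothetical, not taken from the paper.

def local_computation_energy(kappa, cycles_per_sample, num_samples, cpu_freq, local_epochs):
    """Dynamic CPU energy for local training: epochs * kappa * C * D * f^2."""
    return local_epochs * kappa * cycles_per_sample * num_samples * cpu_freq ** 2

def transmission_energy(tx_power, model_bits, bandwidth, channel_gain, noise_power):
    """Uplink energy: transmit power times upload time at the Shannon rate."""
    rate = bandwidth * math.log2(1 + tx_power * channel_gain / noise_power)  # bits/s
    return tx_power * model_bits / rate

# Example: one device in one communication round (hypothetical numbers).
e_comp = local_computation_energy(kappa=1e-28, cycles_per_sample=2e4,
                                  num_samples=500, cpu_freq=1e9, local_epochs=5)
e_comm = transmission_energy(tx_power=0.1, model_bits=1e6,
                             bandwidth=1e6, channel_gain=1e-7, noise_power=1e-13)
print(f"computation: {e_comp:.3f} J, communication: {e_comm:.3f} J")
```

Under this kind of model, energy can be traded off by adjusting CPU frequency, the number of local epochs, model size, or transmit power, which is broadly the design space the abstract refers to.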
Authors
尹自豪
王红梅
白亮亮
张宏也
江川
毕志超
黄伟
YIN Zihao; WANG Hongmei; BAI Liangliang; ZHANG Hongye; JIANG Chuan; BI Zhichao; HUANG Wei (Xinjiang Institute of Engineering, Urumqi 830000, China)
Source
《计算机应用文摘》
2023, No. 16, pp. 87-90, 93 (5 pages)
Chinese Journal of Computer Application
Funding
National-level college student innovation and entrepreneurship project of the Xinjiang Education Department: Joint Design of Federated Learning Based on Reconfigurable Intelligent Surfaces (202210994016).
Keywords
federated learning
wireless communication
local computation
energy consumption