Abstract
Federated learning is a deep learning technique in which multiple devices collaboratively train a globally shared model while keeping private data local, thereby preserving data privacy. However, in complex Internet of Things (IoT) environments, federated learning faces the challenges of statistical heterogeneity and system heterogeneity. Because of divergent local data distributions and high communication and computation costs, over-parameterized models are unsuited for direct deployment in IoT applications. Moreover, non-independent and identically distributed (non-IID) data make federated learning with parameter-averaging aggregation harder to converge. A central research problem in federated learning is how to build a personalized lightweight model for each client from its private data while still aggregating these models into a joint model. To address this problem, we propose an adaptive federated learning algorithm based on an evolution strategy. The method encodes the model architecture and treats each participant as an individual in the evolution strategy, adaptively generating a different personalized subnet for each client through global optimization. Guided by network-unit importance and its genotype, each client extracts the corresponding subnet from the server-side supernet and performs local updates; this partial-network update naturally fits the idea of dropout. Extensive experiments on real-world datasets demonstrate that the proposed framework considerably improves model performance compared with conventional federated learning methods. In particular, when client data are non-IID, the algorithm lowers the barrier for clients with limited communication bandwidth and computing power to participate in federated learning while improving the generalization ability of the global model.
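The paper itself provides no code; the following is a minimal, hypothetical sketch (Python/NumPy) of the idea described in the abstract: a binary genotype encodes which units of a server-side supernet a client keeps, subnets are extracted by dropout-like masking, and an evolution-strategy loop mutates each client's genotype. All names here (extract_subnet, unit_importance, the sparsity-based fitness, the (1+1)-style loop) are illustrative assumptions, not the authors' implementation; a real system would score each genotype by the client's local validation performance after training the extracted subnet.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical supernet: a single hidden layer with UNITS units held on the server.
UNITS, IN_DIM, OUT_DIM = 16, 8, 4
supernet = {
    "W1": rng.standard_normal((IN_DIM, UNITS)),
    "W2": rng.standard_normal((UNITS, OUT_DIM)),
}

def extract_subnet(supernet, genotype):
    """Keep only the hidden units whose gene is 1 (dropout-like masking)."""
    keep = np.flatnonzero(genotype)
    return {"W1": supernet["W1"][:, keep], "W2": supernet["W2"][keep, :]}

def unit_importance(supernet):
    """Simple importance proxy: L2 norm of each hidden unit's in/out weights."""
    return np.linalg.norm(supernet["W1"], axis=0) + np.linalg.norm(supernet["W2"], axis=1)

def mutate(genotype, importance, rate=0.1):
    """Flip genes, biasing removal toward low-importance units (illustrative heuristic)."""
    child = genotype.copy()
    flip = rng.random(child.size) < rate
    prefer_drop = importance < np.median(importance)   # drop weak units, keep strong ones
    child[flip & prefer_drop] = 0
    child[flip & ~prefer_drop] = 1
    if child.sum() == 0:                               # never hand a client an empty subnet
        child[np.argmax(importance)] = 1
    return child

def fitness(genotype, target_sparsity=0.5):
    """Placeholder fitness rewarding a target sparsity; a real system would use the
    client's validation accuracy after locally training the extracted subnet."""
    return -abs(genotype.mean() - target_sparsity)

# (1+1)-style evolution loop: one genotype per client, refined over generations.
num_clients, generations = 5, 20
genotypes = [rng.integers(0, 2, UNITS) for _ in range(num_clients)]
imp = unit_importance(supernet)

for _ in range(generations):
    for c in range(num_clients):
        child = mutate(genotypes[c], imp)
        if fitness(child) >= fitness(genotypes[c]):
            genotypes[c] = child

subnets = [extract_subnet(supernet, g) for g in genotypes]
print([s["W1"].shape for s in subnets])   # each client receives its own pruned subnet
```

In this sketch the server would average each client's updated weights back into only the supernet positions selected by that client's genotype, which is how the partial-update, dropout-like aggregation in the abstract could be realized.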
Authors
公茂果
高原
王炯乾
张元侨
王善峰
谢飞
Maoguo GONG; Yuan GAO; Jiongqian WANG; Yuanqiao ZHANG; Shanfeng WANG; Fei XIE (Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, Xidian University, Xi'an 710071, China; School of Cyber Engineering, Xidian University, Xi'an 710071, China; Academy of Advanced Interdisciplinary Research, Xidian University, Xi'an 710068, China)
Source
《中国科学:信息科学》
CSCD
Peking University Core Journal
2023, No. 3, pp. 437-453 (17 pages)
Scientia Sinica (Informationis)
Funding
Supported by the National Natural Science Foundation of China (Grant Nos. 62036006, 61973249) and the Key Research and Development Program of Shaanxi Province (Grant No. 2021ZDLGY02-06).
Keywords
federated learning
evolution strategy
model encoding
network pruning
local customization