Funding: This work was supported by the National Natural Science Foundation of China under Grant No. 62172386, the Natural Science Foundation of Jiangsu Province of China under Grant No. BK20231212, and the Teaching Research Project of the Education Department of Anhui Province of China under Grant No. 2021jyxm1738.
Abstract: Federated learning (FL) is an emerging privacy-preserving distributed computing paradigm that enables numerous clients to collaboratively train machine learning models without transmitting clients' private datasets to the central server. Unlike most existing research, where clients' local datasets are assumed to remain unchanged throughout the whole FL process, this paper addresses scenarios where clients' datasets need to be updated periodically, and the server can incentivize clients to employ datasets that are as fresh as possible for local model training. Our primary objective is to design a client selection strategy that minimizes the loss of the global FL model within a constrained budget. To this end, we introduce the concept of "Age of Information" (AoI) to quantitatively assess the freshness of local datasets and conduct a theoretical analysis of the convergence bound of our AoI-aware FL system. Based on the convergence bound, we formulate our problem as a restless multi-armed bandit (RMAB) problem. Next, we relax the RMAB problem and apply the Lagrangian dual approach to decouple it into multiple subproblems. Finally, we propose a Whittle's Index Based Client Selection (WICS) algorithm to determine the set of selected clients. Comprehensive simulations substantiate that the proposed algorithm effectively reduces training loss and improves learning accuracy compared with state-of-the-art methods.
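The per-round selection loop described above can be illustrated with a minimal sketch. Note that this is not the paper's actual WICS algorithm: the index function, costs, and AoI dynamics below are hypothetical stand-ins (the paper derives its index from the convergence bound), shown only to convey the general shape of index-based, budget-constrained client selection.

```python
def index_based_selection(aoi, costs, budget, index_fn=None):
    """Rank clients by a Whittle-style index and greedily admit them
    while the per-round budget allows.

    aoi:      current Age-of-Information value per client
    costs:    hypothetical per-client participation cost
    budget:   total budget for this round
    index_fn: hypothetical index; defaults to AoI per unit cost,
              so stale, cheap clients rank first
    Returns the list of selected client ids.
    """
    if index_fn is None:
        index_fn = lambda a, c: a / c
    ranked = sorted(range(len(aoi)),
                    key=lambda i: index_fn(aoi[i], costs[i]),
                    reverse=True)
    selected, spent = [], 0.0
    for i in ranked:
        if spent + costs[i] <= budget:
            selected.append(i)
            spent += costs[i]
    return selected

def step_aoi(aoi, selected):
    """Selected clients refresh their dataset (AoI resets to 1);
    all other clients age by one round."""
    chosen = set(selected)
    return [1 if i in chosen else a + 1 for i, a in enumerate(aoi)]

# One example round with four clients
aoi = [3, 1, 5, 2]
costs = [1.0, 2.0, 1.5, 1.0]
sel = index_based_selection(aoi, costs, budget=2.5)  # picks clients 2 and 0
aoi = step_aoi(aoi, sel)                             # [1, 2, 1, 3]
```

The greedy admission under a budget mirrors the relaxed RMAB view: each subproblem assigns a client an index, and the server serves the highest-index clients first until the budget is exhausted.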