Abstract
To address the slow convergence and unstable training caused by non-independent and identically distributed (non-IID) data in federated learning, the Jensen-Shannon (JS) divergence is used to measure the difference between the data distributions of different clients, the problem of minimizing this distribution difference in federated learning is modeled mathematically, and a distribution-difference-aware federated learning method is proposed. Numerical experiments verify the effectiveness of the method: with the optimized data distribution, the federated model reaches higher training accuracy faster and converges to a more stable state.
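As a minimal illustration (not the paper's implementation), the JS divergence the abstract refers to can be computed between two clients' label distributions as follows; the function name and the smoothing constant `eps` are assumptions for this sketch:

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (base-2) between two discrete distributions.

    Symmetric and bounded in [0, 1]; 0 means identical label
    distributions, 1 means fully disjoint support.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()  # normalize in case raw label counts are passed
    q = q / q.sum()
    m = 0.5 * (p + q)  # mixture distribution

    def kl(a, b):
        # KL divergence with eps smoothing to avoid log(0)
        return np.sum(a * np.log2((a + eps) / (b + eps)))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Identical distributions -> divergence near 0;
# disjoint distributions -> divergence near 1.
print(js_divergence([0.5, 0.5], [0.5, 0.5]))
print(js_divergence([1.0, 0.0], [0.0, 1.0]))
```

In a federated setting, `p` and `q` would typically be the per-class sample counts held by two clients, so the server can quantify pairwise non-IID severity without seeing raw data.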
Authors
HU Zhiyao; YU Miao (Academy of Military Sciences, Beijing 100091, China)
Source
Journal of Information Engineering University (《信息工程大学学报》), 2024, No. 4, pp. 404-410 (7 pages)
Funding
National Natural Science Foundation of China, Young Scientists Fund (62025208)
Keywords
federated learning
data sharing
non-independent and identically distributed (non-IID) data
distribution discrepancy
gradient descent