
A federated learning scheme meets dynamic differential privacy

Abstract: Federated learning has become a widely used distributed learning approach in recent years. Although model training shifts from collecting raw data to gathering parameters, privacy violations may still occur when models are published and shared. A dynamic approach is proposed to add Gaussian noise more effectively and apply differential privacy to federated deep learning. Concretely, it abandons the traditional way of distributing the privacy budget ε equally across rounds and instead adjusts the budget dynamically to fit gradient-descent federated learning, with the per-round parameters derived by computation rather than set by hand, avoiding the impact of manually chosen hyperparameters on the algorithm. It also incorporates adaptive threshold clipping to control the sensitivity. Finally, the moments accountant is used to track the ε consumed by the privacy-preserving mechanism, and learning stops only when the ε_total set by the clients is reached, which allows the privacy budget to be fully exploited for model training. Experimental results on real datasets show that training with this method achieves almost the same performance as non-private model learning and is significantly better than the differential privacy method used by TensorFlow.
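The abstract only outlines the mechanism, so the following is a minimal Python sketch of one plausible reading of it: a decaying per-round budget schedule, an adaptive clipping threshold taken as a percentile of per-example gradient norms, Gaussian noise calibrated to that threshold, and a stop condition when the total budget is exhausted. The schedule `per_round_budget`, the percentile rule, and the simple sequential composition used for accounting are all assumptions for illustration; the paper's exact allocation rule and its moments-accountant bookkeeping are not reproduced here.

```python
import numpy as np

def per_round_budget(round_idx, eps_total, decay=0.05):
    """Hypothetical dynamic allocation: earlier rounds receive larger budgets."""
    return eps_total * decay * np.exp(-decay * round_idx)

def dp_client_update(per_example_grads, eps_t, delta=1e-5, clip_percentile=75):
    """Adaptively clip per-example gradients, average them, and add Gaussian noise."""
    norms = np.linalg.norm(per_example_grads, axis=1)
    clip = np.percentile(norms, clip_percentile)           # adaptive clipping threshold
    scale = np.minimum(1.0, clip / (norms + 1e-12))
    clipped = per_example_grads * scale[:, None]
    # Gaussian-mechanism noise scale for (eps_t, delta)-DP on the mean clipped gradient,
    # whose L2 sensitivity is bounded by clip / batch_size.
    sigma = clip * np.sqrt(2.0 * np.log(1.25 / delta)) / (eps_t * len(norms))
    noisy_mean = clipped.mean(axis=0) + np.random.normal(0.0, sigma, clipped.shape[1])
    return noisy_mean, clip

def train(eps_total=8.0, dims=10, batch=32, max_rounds=1000, lr=0.1):
    theta = np.zeros(dims)
    spent = 0.0
    for t in range(max_rounds):
        eps_t = per_round_budget(t, eps_total)
        if spent + eps_t > eps_total:   # stop once the client-set total budget is reached
            break
        grads = np.random.randn(batch, dims)     # stand-in for real per-example gradients
        noisy_grad, _ = dp_client_update(grads, eps_t)
        theta -= lr * noisy_grad
        spent += eps_t                  # simple additive composition; the paper instead
                                        # uses the moments accountant for a tighter bound
    return theta, spent
```

In this reading, the dynamic schedule spends more of ε_total early (when gradients are large) and less later, and the accountant-driven stop ensures the whole budget is used for training rather than an arbitrary fixed number of rounds.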
Source: CAAI Transactions on Intelligence Technology (SCIE, EI), 2023, No. 3, pp. 1087-1100 (14 pages).
Funding: Supported by the National Natural Science Foundation of China under Grants No. 62062020, No. 72161005, No. 62002081, and No. 62062017; the Technology Foundation of Guizhou Province (Grant No. QianKeHeJiChu-ZK[2022]-General184); and the Guizhou Provincial Science and Technology Projects [2020]1Y265.