
Survey on private model publishing for federated learning (Cited by: 5)
Abstract: Federated learning is a distributed machine learning technology that aims to keep local data from being exposed when machine learning models are trained on big data. However, a series of privacy attacks has shown that an adversary can steal private information from machine learning model parameters alone, even when the local data are inaccessible. Privacy threats therefore arise throughout the model publishing process of federated learning, from the intermediate models exchanged between participants and the aggregator during training to the finally released aggregated model. In response, a large body of protection techniques has emerged, primarily based on differential privacy and on cryptography. This paper briefly introduces the privacy threats and adversary models that may appear when the local models and the aggregated model of federated learning are published, systematically surveys the related defense techniques and research advances, and discusses the development trends of these techniques for privacy-preserving federated learning.
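
For intuition only, the following is a minimal Python sketch of the client-side differential privacy idea referenced in the abstract: each participant clips its local model update and adds Gaussian noise before publishing it to the aggregator, so the transmitted parameters no longer expose the raw update. The function names (clip_update, privatize_update, aggregate), the noise_multiplier value, and the toy data are illustrative assumptions, not the specific mechanisms surveyed in the paper.

import numpy as np

def clip_update(update, clip_norm=1.0):
    # Bound the L2 norm of a local update so its sensitivity is at most clip_norm.
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / max(norm, 1e-12))

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    # Clip the update and add Gaussian noise on the client, before the model leaves the device.
    rng = rng or np.random.default_rng()
    clipped = clip_update(update, clip_norm)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape)
    return clipped + noise

def aggregate(updates):
    # Server-side averaging of the already-noised client updates (the published aggregated model step).
    return np.mean(updates, axis=0)

# Toy example: three clients publish privatized updates of a 4-parameter model.
rng = np.random.default_rng(0)
local_updates = [rng.normal(size=4) for _ in range(3)]
published = [privatize_update(u, rng=rng) for u in local_updates]
global_update = aggregate(published)
print(global_update)

In practice the clipping norm and noise multiplier would be tuned against a target privacy budget with a proper accounting method; this sketch omits any such accounting.
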
Authors: SHI Congcong, GAO Xianzhou, HUANG Xiuli, MAO Yunlong (Global Energy Interconnection Research Institute Nanjing Branch / State Grid Key Laboratory of Information & Network Security, Nanjing 210094; Department of Computer Science and Technology, Nanjing University, Nanjing 210023)
Source: Journal of Nanjing University of Information Science & Technology (Natural Science Edition), 2022, Issue 2, pp. 127-136 (10 pages); indexed in CAS and the Peking University Core Journals list
Funding: Science and Technology Project of the Headquarters of State Grid Corporation of China (5700-202190184A-0-0-00)
Keywords: federated learning; privacy-preserving; differential privacy