A Multi-Server Federated Learning Scheme Based on Differential Privacy and Secret Sharing
Abstract: Federated learning relies on a central-server scheduling mechanism to complete multi-user joint training without data leaving its local domain. Most current federated learning schemes and their related privacy-protection schemes rely on a single central server for encryption, decryption, and gradient computation. This both reduces the server's computational efficiency and risks large-scale leakage of private information if the server suffers an external attack or internal malicious collusion. This paper therefore combines differential privacy with secret sharing to propose a multi-server federated learning scheme. Noise satisfying (ε,δ)-approximate differential privacy is added to each user's locally trained model to prevent multiple servers from colluding to recover private data. The noise-added gradients are then distributed to multiple servers via a secret sharing protocol, securing the transmitted gradients while balancing the computational load across servers to improve overall efficiency. Experiments on public datasets evaluating the scheme's model performance, training overhead, and security show that it offers high security, incurs only about a 4% performance loss relative to the plaintext scheme, and reduces overall computational overhead by nearly 53% compared to a single-server encryption scheme.
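The abstract names two building blocks but gives no implementation details. The sketch below illustrates, under stated assumptions, how they typically compose: a client clips its gradient, adds Gaussian noise calibrated to (ε,δ)-differential privacy (using the standard analytic bound σ ≥ √(2 ln(1.25/δ))·Δ/ε), and splits the noised gradient into additive secret shares, one per server. The function names, the clipping step, and the use of real-valued shares (production schemes typically share over a finite field) are illustrative assumptions, not the authors' exact construction.

```python
import numpy as np

def gaussian_sigma(epsilon, delta, sensitivity):
    # Standard calibration of the Gaussian mechanism for (eps, delta)-DP:
    # sigma >= sqrt(2 ln(1.25/delta)) * sensitivity / epsilon
    return np.sqrt(2.0 * np.log(1.25 / delta)) * sensitivity / epsilon

def add_dp_noise(grad, epsilon, delta, clip_norm, rng):
    # Clip the gradient to bound its L2 sensitivity, then add Gaussian noise.
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / norm)
    sigma = gaussian_sigma(epsilon, delta, clip_norm)
    return clipped + rng.normal(0.0, sigma, size=grad.shape)

def share_additively(grad, n_servers, rng):
    # Additive secret sharing over the reals (illustrative; real deployments
    # work modulo a prime): n-1 random shares, plus one share chosen so that
    # all shares sum to the secret. Any n-1 servers see only random noise.
    shares = [rng.normal(size=grad.shape) for _ in range(n_servers - 1)]
    shares.append(grad - sum(shares))
    return shares

def reconstruct(shares):
    # The servers' aggregate recovers the (noised) gradient exactly.
    return sum(shares)

# A single client round: noise the local gradient, then distribute shares.
rng = np.random.default_rng(42)
grad = np.array([3.0, -1.0, 0.5])
noisy = add_dp_noise(grad, epsilon=1.0, delta=1e-5, clip_norm=1.0, rng=rng)
shares = share_additively(noisy, n_servers=3, rng=rng)
```

Because the shares sum to the noised gradient, the servers can jointly aggregate without any single server (or any colluding subset smaller than all of them) learning the client's update; the DP noise additionally protects against full-collusion recovery, which is the threat model the abstract describes.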
Authors: CHEN Jing; PENG Changgen; TAN Weijie; XU Dequan (State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, China; Guizhou Big Data Academy, Guizhou University, Guiyang 550025, China)
Published in: Netinfo Security (《信息网络安全》), CSCD / Peking University Core journal, 2023, No. 7, pp. 98–110 (13 pages)
Funding: National Natural Science Foundation of China [62272124].
Keywords: federated learning; differential privacy; secret sharing; multi-server; privacy security