
PATE Teacher Model Aggregation Optimization Method Based on Federated Learning
Abstract: Private Aggregation of Teacher Ensembles (PATE) is an important privacy-preserving method, but it trains inaccurate models when the training dataset is small. To address this problem, this paper proposes a federated-learning-based aggregation optimization method for the PATE teacher models. First, federated learning is introduced into the training of the teacher models in the PATE scheme to improve model accuracy when training data are scarce. Second, within this optimized scheme, differential privacy is used to protect the privacy of the model parameters, reducing the risk of privacy leakage under attack. Finally, the feasibility of the scheme is verified on the MNIST dataset; experimental results show that, at the same level of privacy protection, the proposed scheme trains more accurate models than the traditional privacy aggregation scheme.
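As background for the abstract, the aggregation step that PATE builds on can be sketched as follows. This is a minimal illustration of the standard PATE noisy-max mechanism (Laplace noise added to teacher vote counts before taking the argmax), not the paper's federated variant; the function names and the choice of noise scale 2/ε are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) by inverse-CDF sampling."""
    u = random.uniform(-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def noisy_max_aggregate(teacher_votes, num_classes, epsilon):
    """PATE-style noisy-max aggregation (illustrative sketch).

    Each teacher contributes one predicted label; Laplace noise with
    scale 2/epsilon is added to each per-class vote count, and the label
    with the largest noisy count is released to the student model.
    """
    counts = [0] * num_classes
    for vote in teacher_votes:
        counts[vote] += 1
    noisy_counts = [c + laplace_noise(2.0 / epsilon) for c in counts]
    return max(range(num_classes), key=lambda k: noisy_counts[k])

# Example: 95 of 100 teachers vote for class 7, so with a weak noise
# level the released label is almost certainly 7.
votes = [7] * 95 + [1] * 5
label = noisy_max_aggregate(votes, num_classes=10, epsilon=50.0)
```

When the teachers disagree or the vote gap is small, the noise dominates and the released label carries little information about any single teacher's training data; this is the intuition behind PATE's privacy guarantee, and also why small per-teacher datasets (the problem the paper targets) hurt accuracy.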
Authors: WANG Shouxin; PENG Changgen; LIU Hai; TAN Weijie; ZHANG Hong (College of Computer Science and Technology, Guizhou University, Guiyang 550025; State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025; Guizhou Big Data Academy, Guizhou University, Guiyang 550025)
Source: Computer & Digital Engineering, 2023, No. 11, pp. 2608-2614 (7 pages)
Funding: National Natural Science Foundation of China project "Research on Block Data Fusion Analysis Theory and Security Control Models for Data-Sharing Applications" (No. U1836205); National Natural Science Foundation of China project "Research on Machine-Learning-Based Adaptive Differential Privacy Protection Models and Algorithms for Graph Data" (No. 62002081); Guizhou University talent introduction research project "Research on Key Technologies of Physical-Fingerprint-Based Endogenous Security for the 6G Internet of Things" (No. 贵大人基合字[2020]61号); Guizhou University cultivation project "Research on Key Technologies of Blockchain-Based 5G Endogenous Security" (No. 贵大培育[2019]56号).
Keywords: privacy aggregation; privacy preservation; federated learning; differential privacy