DP-ASSGD: Differential Privacy Protection Based on Stochastic Gradient Descent Optimization
Authors: Qiang Gao, Han Sun, Zhifang Wang. 《国际计算机前沿大会会议论文集》 (EI), 2023, Issue 1, pp. 298-308 (11 pages).
Recently, differential privacy algorithms based on deep learning have become increasingly mature. Previous studies mostly provide privacy by adding differential privacy noise to the gradient, but this reduces accuracy, and it is difficult to balance privacy and accuracy. In this paper, the DP-ASSGD algorithm is proposed to balance privacy and accuracy. The convergence speed is improved, the number of optimization iterations is decreased, and the privacy loss is significantly reduced. In addition, by exploiting the post-processing immunity of the differential privacy model, a Laplace smoothing mechanism is added to make the training process more stable and the generalization ability stronger. Experiments on the MNIST dataset show that, under the same privacy budget, the accuracy is improved by 1.8% on average compared with existing differential privacy algorithms. When achieving the same accuracy, the DP-ASSGD algorithm consumes a smaller privacy budget.
Keywords: differential privacy protection; learning rate adaptation; Laplace smoothing
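The abstract describes the standard DP-SGD pattern (clip per-example gradients, add calibrated noise) followed by a Laplace smoothing step that, as post-processing of an already-private gradient, consumes no extra privacy budget. The sketch below illustrates that general idea only; it is not the paper's DP-ASSGD implementation, and the function names, the Gaussian noise mechanism, the FFT-based Laplacian smoothing, and all parameter values are illustrative assumptions.

```python
import numpy as np

def clip_and_noise(per_example_grads, clip_norm, noise_multiplier, rng):
    """Standard DP-SGD step: clip each per-example gradient to clip_norm,
    average, and add Gaussian noise calibrated to the clipping bound."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    sigma = noise_multiplier * clip_norm / len(per_example_grads)
    return mean_grad + rng.normal(0.0, sigma, size=mean_grad.shape)

def laplace_smooth(grad, sigma_s=1.0):
    """Laplacian smoothing via FFT: solves (I - sigma_s * Delta) v = grad on a
    1-D periodic grid. Applied after the noise, this is pure post-processing,
    so by the post-processing property it adds no privacy cost."""
    flat = grad.ravel()
    n = flat.size
    # Discrete Laplacian kernel [-2, 1, ..., 1] in the frequency domain
    lap = np.zeros(n)
    lap[0], lap[1], lap[-1] = -2.0, 1.0, 1.0
    denom = 1.0 - sigma_s * np.fft.fft(lap)   # always >= 1 in magnitude
    smoothed = np.real(np.fft.ifft(np.fft.fft(flat) / denom))
    return smoothed.reshape(grad.shape)

# One illustrative update step with stand-in gradients
rng = np.random.default_rng(0)
params = rng.normal(size=100)
per_example_grads = [rng.normal(size=100) for _ in range(32)]
noisy = clip_and_noise(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=rng)
params -= 0.1 * laplace_smooth(noisy, sigma_s=1.0)
```

The adaptive learning rate component mentioned in the title is omitted here; the sketch only shows how a smoothing operator can be layered on top of the noisy gradient without affecting the privacy accounting.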