
Federated Learning Based on Data Divergence and Differential Privacy in Financial Risk Control Research

Abstract: In the financial sector, data are highly confidential and sensitive, and ensuring data privacy is critical. Sample fusion is the basis of horizontal federated learning, but it is suitable only for scenarios where customers have the same format but different targets, that is, scenarios with strong feature overlap and weak user overlap. To overcome this limitation, this paper proposes a federated learning-based model with local data sharing and differential privacy. The indexing mechanism of differential privacy is used to obtain different degrees of privacy budgets, which are applied to the gradients according to their contribution degree, ensuring privacy without affecting accuracy. In addition, data sharing is performed to improve the utility of the global model. Further, a distributed prediction model is used to predict customers' loan propensity while protecting user privacy. An aggregation mechanism based on federated learning allows the model to be trained on distributed data without exposing local data. The proposed method is verified by experiments, and the results show that for non-IID data it effectively improves accuracy and reduces the impact of sample skew. The method can be extended to edge computing, blockchain, and the Industrial Internet of Things (IIoT). Theoretical analysis and experimental results show that the proposed method ensures the privacy and accuracy of the federated learning process and improves model utility on non-IID data by 7% compared with the federated averaging method (FedAvg).
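To make the aggregation idea above concrete, the following Python sketch shows a FedAvg-style weighted average in which each client's gradient is perturbed with Laplace noise calibrated to a per-client privacy budget derived from a contribution score. This is an illustrative sketch only, not the paper's implementation: the function names, the `epsilon_base` parameter, and the assumption that a higher contribution score maps to a larger budget (and thus less noise) are all assumptions made for the example.

```python
import numpy as np

def dp_noisy_gradient(grad, epsilon, sensitivity=1.0):
    """Add Laplace noise calibrated to a per-client privacy budget.
    A larger epsilon (looser budget) means less noise is added."""
    scale = sensitivity / epsilon
    return grad + np.random.laplace(0.0, scale, size=grad.shape)

def federated_average(client_grads, client_sizes, contributions, epsilon_base=1.0):
    """FedAvg-style aggregation of locally perturbed gradients.

    contributions: hypothetical per-client scores in (0, 1]; clients judged
    to contribute more receive a larger privacy budget (less noise).  This
    mapping from contribution degree to budget is an assumption for
    illustration, not the authors' exact scheme.
    """
    noisy = [
        dp_noisy_gradient(g, epsilon_base * c)
        for g, c in zip(client_grads, contributions)
    ]
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    return sum(w * g for w, g in zip(weights, noisy))

# Example: three clients with 10-dimensional gradients.
rng = np.random.default_rng(0)
grads = [rng.normal(size=10) for _ in range(3)]
sizes = [120, 80, 200]        # local sample counts per client
contribs = [0.9, 0.5, 0.7]    # hypothetical contribution scores
global_update = federated_average(grads, sizes, contribs)
print(global_update)
```

In this sketch the server only ever sees gradients that were already noised on the client side, which is the sense in which local data are never exposed during aggregation.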
Source: Computers, Materials & Continua (SCIE, EI), 2023, No. 4, pp. 863-878 (16 pages)
Funding: Supported by the National Natural Science Foundation of China (NSFC) under the National Natural Science Foundation Youth Fund Program (J. Hao, No. 62101275).