
Privacy-preserving cloud-end collaborative training
Abstract  China has advantages of scale and diversity in data resources and a latecomer advantage in mobile internet data applications; its rich application scenarios generate massive amounts of data, from which recommendation systems can mine valuable information and thereby mitigate information overload. Most existing work focuses on centralized recommendation, in which data are trained on the cloud. As data security and privacy protection concerns grow, collecting user data from end-side devices has become increasingly difficult, making centralized recommendation infeasible. This study proposes FedMNN (federated machine learning and mobile neural network), a cloud-end collaborative training method for recommender systems based on federated machine learning (FedML) and the mobile neural network (MNN) framework. It operates in a decentralized manner, exploits the complementary strengths of end-side devices and cloud servers, and takes data security and privacy protection fully into account. The method consists of three parts. First, cloud-side models implemented in various deep learning frameworks are converted, via ONNX (open neural network exchange) as an intermediate representation and the MNN model conversion tool, into general MNN models that end-side devices can train. Second, the cloud sends the model to end-side devices; after initialization, each device trains on its local data, computes the loss, and performs gradient back-propagation. Finally, the locally trained models are fed back to the cloud, where the federated learning framework aggregates and updates them; the cloud-side model is then deployed to end-side devices on demand, achieving cloud-end collaboration. Experiments comparing the power consumption of FedMNN and the FLTFlite (Flower and TensorFlow Lite) framework on benchmark tasks show that FedMNN consumes 32% to 51% less power than FLTFlite. Using two recommendation models, DSSM (deep structured semantic model) and Deep and Wide, the experiments further verify the effectiveness of cloud-end collaborative training.
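The final aggregation step described in the abstract follows the standard federated averaging pattern (the abstract does not give FedML's exact aggregation details, so this is a minimal sketch assuming sample-count-weighted averaging over plain Python weight vectors):

```python
def fed_avg(client_weights, client_sizes):
    """Sample-count-weighted average of client model weights (FedAvg-style).

    client_weights: list of weight vectors, one per end-side device
    client_sizes:   number of local training samples on each device
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    aggregated = [0.0] * dim
    for weights, n in zip(client_weights, client_sizes):
        # Devices with more local data contribute proportionally more.
        for i, w in enumerate(weights):
            aggregated[i] += w * (n / total)
    return aggregated

# Example: two devices with unequal data volumes
w_new = fed_avg([[1.0, 2.0], [3.0, 4.0]], [100, 300])
print(w_new)  # [2.5, 3.5]
```

The cloud would apply this after each round of on-device training, then redistribute the updated model to the devices.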
Authors  GAO Xiangyun; MENG Dan; LUO Mingkai; WANG Jun; ZHANG Liping; KONG Chao (School of Computer and Information, Anhui Polytechnic University, Wuhu, Anhui 241000, China; OPPO Research Institute, Shenzhen, Guangdong 518000, China; College of Electronic and Information Engineering, Tongji University, Shanghai 201804, China; Reconfigurable and Intelligent Computing Laboratory, Anhui Polytechnic University, Wuhu, Anhui 241000, China)
Published in  Journal of East China Normal University (Natural Science), 2023, No. 5, pp. 77-89 (13 pages). Indexed in CAS and CSCD; Peking University core journal.
Funding  National Natural Science Foundation of China (61902001); Anhui Polytechnic University Undergraduate Teaching Quality Improvement Program (2022lzyybj02).
Keywords  privacy protection; federated learning; machine learning; cloud-end collaborative training