
Federated Continual Learning Based on Prototype Learning
Abstract  Federated learning allows multiple participants to collaboratively train models while preserving privacy. However, traditional federated learning does not support continual learning and is therefore ill-suited to dynamic application scenarios. Federated continual learning has recently attracted wide attention: it enables multiple participants to keep learning over time while collaborating. This more complex setting introduces additional challenges, including catastrophic forgetting, heterogeneity, and limited communication resources. To address these challenges, this paper proposes a prototype-based federated continual learning method. The method shares knowledge through prototypes, which improves communication efficiency while strengthening adaptability to model heterogeneity. In addition, it introduces a mechanism that prevents catastrophic forgetting through knowledge distillation and replay. We provide a convergence analysis of the proposed method and validate its effectiveness through comparative and ablation experiments.
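The abstract's central mechanism, sharing class prototypes instead of full model weights, can be illustrated with a minimal sketch. Everything below (the function names, the feature dimension, the weighted-averaging rule) is an illustrative assumption rather than the paper's actual implementation: each client averages the feature embeddings of the classes it has seen and uploads only those vectors, and the server combines prototypes across clients.

```python
# Minimal sketch of prototype-based knowledge sharing in federated
# continual learning. All names and shapes here are illustrative
# assumptions, not the paper's implementation.
import numpy as np

FEATURE_DIM = 128  # assumed embedding size

def local_prototypes(features, labels):
    """Average the embeddings of each class seen by one client.

    features: (n_samples, FEATURE_DIM) array of embeddings
    labels:   (n_samples,) array of integer class ids
    Returns {class_id: prototype vector}. Only these vectors (not raw
    data or model weights) are uploaded, which is what keeps
    communication cheap and tolerates heterogeneous client models.
    """
    return {int(c): features[labels == c].mean(axis=0)
            for c in np.unique(labels)}

def aggregate_prototypes(client_protos, client_sizes):
    """Server-side weighted averaging of per-class prototypes.

    client_protos: list of {class_id: vector} dicts, one per client
    client_sizes:  list of {class_id: sample count} dicts
    """
    global_protos = {}
    for c in {c for p in client_protos for c in p}:
        vecs = [p[c] for p in client_protos if c in p]
        weights = [s[c] for p, s in zip(client_protos, client_sizes) if c in p]
        global_protos[c] = np.average(vecs, axis=0, weights=weights)
    return global_protos
```

The weighting by per-class sample count is one natural choice for handling data heterogeneity across clients; the paper may use a different aggregation rule.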
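The forgetting-prevention mechanism combines knowledge distillation with replay. A hedged sketch of what such a training objective could look like, written in PyTorch; the weighting factor, temperature, and replay handling are illustrative values, not taken from the paper:

```python
# Sketch of a distillation-plus-replay objective: cross-entropy on the
# current task, a KL term that keeps the new model close to a frozen
# copy trained on earlier tasks, and cross-entropy on replayed
# exemplars. Hyperparameters are assumptions for illustration only.
import torch
import torch.nn.functional as F

def continual_loss(model, old_model, x, y, replay_x, replay_y,
                   temperature=2.0, alpha=0.5):
    # Supervised loss on the current task batch.
    logits = model(x)
    loss = F.cross_entropy(logits, y)

    # Knowledge distillation: match the frozen old model's softened
    # predictions so earlier-task behaviour is preserved.
    with torch.no_grad():
        old_logits = old_model(x)
    loss += alpha * F.kl_div(
        F.log_softmax(logits / temperature, dim=1),
        F.softmax(old_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # Replay: revisit stored exemplars from previous tasks.
    if replay_x is not None:
        loss += F.cross_entropy(model(replay_x), replay_y)
    return loss
```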
Authors  Hao-Dong ZHANG; Liu YANG; Jian YU; Qing-Hua HU; Li-Ping JING (College of Intelligence and Computing, Tianjin University, Tianjin 300354, China; School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China)
Source  Scientia Sinica Informationis (《中国科学:信息科学》), 2024, No. 10, pp. 2428-2442 (15 pages); indexed in CSCD and the Peking University Core Journals list
Funding  Supported by the National Natural Science Foundation of China (Grant Nos. 62076179, 61925602, U23B2049, U23B2062)
Keywords  federated continual learning; prototype learning; knowledge distillation; catastrophic forgetting; data heterogeneity