Class Incremental Learning by Adaptive Feature Consolidation with Parameter Optimization
Abstract: Aiming at the catastrophic forgetting problem that deep network models exhibit on image classification tasks in incremental scenarios, a class incremental learning method with adaptive feature consolidation and weight selection is proposed. The method uses knowledge distillation as its basic framework: it consolidates the output features of the backbone and classification networks of the previous and current task models, and applies distillation constraints together with a custom disparity loss so that the current model retains the generalization ability of the old model. In the incremental learning phase, the importance of the neural network's parameters is evaluated, and changes to important parameters are penalized when learning a new task, which effectively prevents the new model from overwriting important knowledge related to previous tasks. Experimental results show that the proposed method exploits the model's capacity for incremental learning and effectively alleviates catastrophic forgetting.
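The abstract only sketches these two mechanisms, so the snippet below is a minimal PyTorch illustration, not the authors' implementation. It assumes an MAS-style importance estimate and Hinton-style temperature-scaled output distillation; all function names, the temperature T, and the penalty weight lam are hypothetical.

```python
import torch
import torch.nn.functional as F

def feature_distillation_loss(backbone_feat, old_backbone_feat,
                              logits, old_logits, T=2.0):
    """Consolidate the current model against the frozen old model at two
    levels: backbone features and classifier outputs."""
    # Feature-level consolidation: match the old model's backbone features.
    feat_loss = F.mse_loss(backbone_feat, old_backbone_feat.detach())
    # Output-level distillation over the old classes (assumes the new
    # classifier head lists old classes first), with temperature T.
    n_old = old_logits.size(1)
    kd_loss = F.kl_div(
        F.log_softmax(logits[:, :n_old] / T, dim=1),
        F.softmax(old_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return feat_loss + kd_loss

def estimate_importance(model, loader, device="cpu"):
    """MAS-style importance: average absolute gradient of the squared
    output norm with respect to each parameter over one data pass."""
    importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, _ in loader:
        x = x.to(device)
        model.zero_grad()
        out = model(x)
        out.pow(2).sum().backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                importance[n] += p.grad.abs() / len(loader)
    return importance

def importance_penalty(model, old_params, importance, lam=1.0):
    """Quadratic penalty on drift of parameters deemed important for
    previous tasks; old_params holds detached copies from the old model."""
    loss = 0.0
    for n, p in model.named_parameters():
        loss = loss + (importance[n] * (p - old_params[n]).pow(2)).sum()
    return lam * loss
```

In training, the total objective would combine the new-task classification loss with feature_distillation_loss and importance_penalty, with the importance estimates refreshed after each task, matching the per-phase evaluation the abstract describes.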
Authors: XU An, WU Yongming, ZHENG Yang (State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, China; Key Laboratory of Advanced Manufacturing Technology of Ministry of Education, Guizhou University, Guiyang 550025, China)
Source: Computer Engineering and Applications (《计算机工程与应用》), CSCD, Peking University Core Journal, 2024, No. 3, pp. 220-227 (8 pages)
Funding: National Natural Science Foundation of China (51505094); Guizhou Provincial Science and Technology Foundation (ZK[2023]一般079)
Keywords: incremental learning; catastrophic forgetting; parameter optimization; feature consolidation; knowledge distillation