
Lightweight Knowledge Distillation for Few-shot Learning
Abstract: Few-shot learning aims at simulating the ability of human beings to quickly learn new things from only a few samples, which is of great significance for deep learning tasks when samples are limited. However, in many practical tasks with limited computing resources, model scale may still restrict the wider application of few-shot learning, which presents a realistic demand for lightweight few-shot learning. As a widely used auxiliary strategy in deep learning, knowledge distillation transfers knowledge between models through additional supervised information and has practical applications in both improving model accuracy and reducing model scale. This study first verifies the effectiveness of the knowledge distillation strategy for model lightweighting in few-shot learning. Then, according to the characteristics of few-shot learning, two new distillation methods are designed: (1) distillation based on image local features; (2) distillation based on auxiliary classifiers. Experiments on the miniImageNet and TieredImageNet datasets demonstrate that the new distillation methods are significantly superior to traditional knowledge distillation on few-shot learning tasks.
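For context, the "additional supervised information" in traditional knowledge distillation is typically the teacher's temperature-softened class distribution, matched by the student via a KL-divergence term (Hinton et al.'s formulation). The sketch below illustrates that baseline objective only; the paper's local-feature and auxiliary-classifier variants are not reproduced here, and all function names are illustrative.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T gives softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Classic distillation term: KL(teacher || student) between
    temperature-softened distributions, scaled by T^2 so gradient
    magnitude stays comparable across temperatures."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()

# Toy check: the loss vanishes when the student matches the teacher.
teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[1.0, 1.0, 0.0]])
print(kd_loss(teacher, teacher))      # ~0.0
print(kd_loss(student, teacher) > 0)  # True
```

In practice this term is combined with the usual cross-entropy on ground-truth labels, weighted by a hyperparameter.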
Authors: CHEN Jia-Yan (陈嘉言), REN Dong-Dong (任东东), LI Wen-Bin (李文斌), HUO Jing (霍静), GAO Yang (高阳) (Department of Computer Science and Technology, Nanjing University, Nanjing 210023, China)
Source: Journal of Software (《软件学报》), 2024, Issue 5, pp. 2414-2429 (16 pages). Indexed in EI, CSCD, and the Peking University Core Journal list.
Funding: National Natural Science Foundation of China (62106100, 62192783, 62276128); Natural Science Foundation of Jiangsu Province (BK20221441); Jiangsu Shuangchuang Doctor Program (JSSCBS20210021).
Keywords: deep learning; few-shot learning; image recognition; knowledge distillation (KD); model lightweighting