
Online Knowledge Distillation Based on Consistency Regularization (Cited by: 1)

OKDCR: Online Knowledge Distillation via Consistency Regularization
Abstract: Online knowledge distillation trains two or more peer models simultaneously and lets them learn each other's extracted features, improving the performance of all models jointly. Existing methods focus on aligning features directly between models and thus neglect the distinctive, robust features around the decision boundary. This paper uses consistency regularization to guide each model toward discriminative features at the decision boundary. Specifically, each model consists of a feature extractor and a pair of task-specific classifiers. Intra-model consistency is measured by the distribution distance between the two classifiers of the same model, and inter-model consistency by the distance between corresponding classifiers of different models; the two consistency terms jointly update the feature extractors and regularize feature learning around the decision boundary. In addition, the intra-model consistency serves as an adaptive weight on each model's mean prediction when forming the ensemble prediction, which in turn guides all classifiers to learn from the ensemble for joint alignment of the peer models. The proposed method achieves consistently strong classification performance on multiple public datasets.
Authors: 张晓冰 (Zhang Xiaobing); 龚海刚 (Gong Haigang); 刘明 (Liu Ming) — School of Computer Science & Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Source: Application Research of Computers (《计算机应用研究》, CSCD, Peking University Core Journal), 2021, No. 11, pp. 3249-3253 (5 pages)
Funding: National Natural Science Foundation of China (61572113); Fundamental Research Funds for the Central Universities (XGBDFZ09)
Keywords: computer vision; model compression; online knowledge distillation; consistency regularization
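To make the method described in the abstract concrete, the following is a minimal, illustrative PyTorch sketch of the two consistency terms and the adaptive ensemble. It assumes two peer models, a toy linear feature extractor, and KL divergence as the distribution distance; all names here (PeerModel, consistency_losses, etc.) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PeerModel(nn.Module):
    """One peer: a feature extractor followed by a pair of task-specific classifiers."""
    def __init__(self, feat_dim=128, num_classes=10):
        super().__init__()
        # Toy extractor for 32x32 RGB inputs; a real model would use a CNN backbone.
        self.extractor = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, feat_dim), nn.ReLU())
        self.cls1 = nn.Linear(feat_dim, num_classes)
        self.cls2 = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        f = self.extractor(x)
        return self.cls1(f), self.cls2(f)

def kl(p_logits, q_logits, T=1.0):
    """KL(p || q) between softened classifier distributions (one possible distance)."""
    p = F.log_softmax(p_logits / T, dim=1)
    q = F.softmax(q_logits / T, dim=1)
    return F.kl_div(p, q, reduction="batchmean")

def consistency_losses(models, x):
    outs = [m(x) for m in models]                      # [(logits_a, logits_b), ...]
    # Intra-model consistency: distance between the two classifiers of the same model.
    intra = [kl(a, b) for a, b in outs]
    # Inter-model consistency: distance between corresponding classifiers of peer models.
    inter = []
    for i in range(len(outs)):
        for j in range(i + 1, len(outs)):
            inter.append(kl(outs[i][0], outs[j][0]) + kl(outs[i][1], outs[j][1]))
    # Adaptive ensemble: a model whose classifiers agree more (smaller intra term)
    # contributes a larger weight on its mean prediction.
    w = F.softmax(-torch.stack(intra), dim=0)
    mean_preds = [F.softmax((a + b) / 2, dim=1) for a, b in outs]
    ensemble = sum(wi * p for wi, p in zip(w, mean_preds)).detach()
    return sum(intra), sum(inter), ensemble

# Usage sketch: the intra/inter terms regularize feature learning, and the weighted
# ensemble would serve as a soft target that every classifier distills from (e.g. via KL).
models = [PeerModel(), PeerModel()]
x = torch.randn(8, 3, 32, 32)
intra_loss, inter_loss, ensemble = consistency_losses(models, x)

In this sketch the ensemble is detached so it acts only as a soft target; how the consistency terms, the distillation term, and the supervised loss are weighted against each other is a training detail not specified in the abstract.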