
Knowledge distillation method of object detection model based on mask
Abstract: Previous knowledge distillation algorithms ignore the distillation of background information and fail to distinguish the relative importance of foreground information. To address these problems, a weighted mask is proposed to balance the foreground and background information of the teacher network. By distinguishing the importance of foreground versus background, part of the background knowledge is retained to help the student network learn the generalization ability of the teacher network; by further differentiating the importance of foreground knowledge, the student network is guided toward the key regions to learn and can better absorb the teacher network's knowledge. Distillation training based on the Yolov3 model on the COCO2017 dataset improves detection accuracy (mAP) from 0.247 to 0.436 with no increase in model parameters.
Authors: CHEN Yu-xuan, ZHANG Ben-ting, PU Yue-gang, ZHANG Ming-qing (Institute 706, Second Academy of China Aerospace Science and Industry Corporation, Beijing 100854, China)
Source: Computer Engineering and Design (《计算机工程与设计》), Peking University Core Journal, 2023, Issue 9, pp. 2822-2828 (7 pages)
Keywords: model deployment; model compression; knowledge distillation; object detection; mask; confidence map; global relation
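As a rough illustration of the weighted-mask idea described in the abstract, the sketch below builds a per-pixel weight map that keeps a small constant weight on background (so some background knowledge is still distilled) and weights foreground pixels by teacher confidence (so more important objects dominate), then applies it in a weighted MSE feature-distillation loss. This is a minimal sketch under assumed conventions, not the paper's actual formulation; all function names, the `bg_weight` parameter, and the overlap-handling rule are illustrative.

```python
def build_weight_mask(h, w, boxes, confidences, bg_weight=0.1):
    """Build an h x w weight map for feature distillation.

    boxes: list of (x1, y1, x2, y2) ground-truth boxes in feature-map coords.
    confidences: teacher confidence per box, used to rank foreground importance.
    bg_weight: small constant weight that retains partial background knowledge.
    """
    mask = [[bg_weight] * w for _ in range(h)]
    for (x1, y1, x2, y2), conf in zip(boxes, confidences):
        for y in range(max(0, y1), min(h, y2)):
            for x in range(max(0, x1), min(w, x2)):
                # Where boxes overlap, let the more confident object dominate.
                mask[y][x] = max(mask[y][x], conf)
    return mask


def masked_distill_loss(student_feat, teacher_feat, mask):
    """Weighted MSE between student and teacher feature maps (h x w lists),
    normalized by the total mask weight."""
    total, norm = 0.0, 0.0
    for s_row, t_row, m_row in zip(student_feat, teacher_feat, mask):
        for s, t, m in zip(s_row, t_row, m_row):
            total += m * (s - t) ** 2
            norm += m
    return total / max(norm, 1e-8)
```

In practice such a mask would be applied to intermediate feature maps of both networks during distillation training, so gradients on foreground regions are scaled up relative to background without discarding background entirely.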