
Multi-level Attention Distillation Learning for Image Semantic Segmentation

MAD: Multi-Attention Distillation Learning for Semantic Segmentation
Abstract: Traditional distillation learning performs only one-way distillation from a large network to a lightweight network. This not only makes it difficult to obtain feedback from the lightweight network's learning state and adjust the training process accordingly, but also limits the lightweight network's feature representation ability. This paper proposes MAD (Multi-Attention Distillation), a self-learning optimization method that exploits the network's own multi-level attention context information: in a self-supervised manner, the mature parts of the network constrain the immature parts, i.e., shallow layers extract useful context information from deep layers and learn to mimic high-level feature representations, thereby improving the overall representation ability of the network. The method is validated with the lightweight networks ERFNet and DeepLabV3 on two datasets for different tasks, CULane and VOC. Experimental results show that MAD improves the networks' feature extraction ability without increasing inference time, raising the F1-measure of ERFNet on the CULane task by 2.13 and the mIoU of DeepLabV3 on the VOC task by 1.5.
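To make the mechanism described in the abstract concrete, below is a minimal PyTorch sketch of layer-to-layer attention self-distillation in the spirit of MAD. The attention-map definition (channel-wise sum of squared activations), the MSE loss, and the helper names attention_map and mad_loss are illustrative assumptions for this sketch, not the paper's exact formulation.

    # Minimal sketch of self-supervised attention distillation: each shallow
    # layer's attention map is trained to mimic the next deeper layer's map.
    # Shapes, loss choice, and helper names are assumptions, not the paper's.
    import torch
    import torch.nn.functional as F

    def attention_map(feat: torch.Tensor) -> torch.Tensor:
        """Collapse a feature map (N, C, H, W) into a spatial attention map
        (N, H*W): sum squared activations over channels, then L2-normalize."""
        att = feat.pow(2).sum(dim=1)      # (N, H, W)
        att = att.flatten(1)              # (N, H*W)
        return F.normalize(att, p=2, dim=1)

    def mad_loss(features: list) -> torch.Tensor:
        """Distillation loss over consecutive layer pairs. Deeper maps are
        detached so gradients only update the shallower (immature) layers."""
        loss = features[0].new_zeros(())
        for shallow, deep in zip(features[:-1], features[1:]):
            # Upsample the deeper feature to the shallow resolution if needed.
            if deep.shape[-2:] != shallow.shape[-2:]:
                deep = F.interpolate(deep, size=shallow.shape[-2:],
                                     mode='bilinear', align_corners=False)
            a_s = attention_map(shallow)
            a_d = attention_map(deep).detach()   # target carries no gradient
            loss = loss + F.mse_loss(a_s, a_d)
        return loss

In training, this term would be added to the segmentation loss, e.g. total_loss = seg_loss + lambda_mad * mad_loss([f1, f2, f3, f4]), where f1..f4 are intermediate encoder features and lambda_mad is a weighting hyperparameter; since the distillation branch is dropped at test time, inference cost is unchanged, consistent with the abstract's claim.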
Authors: 刘佳琦 (LIU Jiaqi), 杨璐 (YANG Lu), 王龙志 (WANG Longzhi) (Tianjin Key Laboratory for Advanced Mechatronic System Design and Intelligent Control, School of Mechanical Engineering, Tianjin University of Technology, Tianjin 300384, China; National Demonstration Center for Experimental Mechanical and Electrical Engineering Education (Tianjin University of Technology), Tianjin 300384, China; Autobrain (Tianjin) Technology, LTD, Tianjin 300300, China)
Source: Intelligent Computer and Applications (《智能计算机与应用》), 2021, No. 5, pp. 13-18, 25 (7 pages)
Funding: Tianjin Natural Science Foundation (16JCQNJC04100)
Keywords: distillation learning; semantic segmentation; attention; convolutional neural network