Compression method of super-resolution convolutional neural network based on knowledge distillation (cited by: 7)
Abstract: Current deep-learning models for super-resolution image reconstruction have deep structures, high computational complexity, and large storage requirements, which prevents them from running effectively on resource-constrained devices. To address this problem, a compression method for super-resolution convolutional neural networks based on knowledge distillation is proposed. The method uses a teacher network with many parameters and good reconstruction quality together with a student network with few parameters and poorer reconstruction quality. The teacher network is trained first; knowledge distillation is then used to transfer knowledge from the teacher network to the student network; finally, the reconstruction quality of the student network is improved without changing its structure or number of parameters. Peak Signal-to-Noise Ratio (PSNR) is used to evaluate reconstruction quality in the experiments. Compared with the student network trained without knowledge distillation, the student network trained with knowledge distillation improves PSNR by 0.53 dB, 0.37 dB, 0.24 dB, and 0.45 dB on four public test sets at a magnification factor of 3. Without changing the structure of the student network, the proposed method significantly improves its super-resolution reconstruction quality.
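The abstract describes training a student network against both the ground-truth high-resolution image and the teacher's output, and evaluating with PSNR. The sketch below illustrates that idea in minimal numpy form; the L1 loss terms, the `alpha` weighting, and the function names are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def psnr(ref, img, max_val=1.0):
    """Peak Signal-to-Noise Ratio in dB between a reference image and a
    reconstruction, for pixel values in [0, max_val]."""
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

def distillation_loss(student_out, teacher_out, ground_truth, alpha=0.5):
    """Hypothetical combined training loss for the student network:
    a reconstruction term against the ground-truth high-resolution image
    plus a distillation term pulling the student's output toward the
    teacher's output. Both terms use L1 here purely for illustration."""
    recon = np.mean(np.abs(student_out - ground_truth))
    distill = np.mean(np.abs(student_out - teacher_out))
    return alpha * recon + (1.0 - alpha) * distill
```

As a sanity check, a reconstruction that is uniformly 0.1 away from a reference in [0, 1] has MSE 0.01 and therefore a PSNR of 20 dB, and the loss is zero when student, teacher, and ground truth coincide.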
Authors: GAO Qinquan; ZHAO Yan; LI Gen; TONG Tong (College of Physics and Information Engineering, Fuzhou University, Fuzhou, Fujian 350108, China; Key Laboratory of Medical Instrumentation & Pharmaceutical Technology of Fujian Province (Fuzhou University), Fuzhou, Fujian 350108, China; Imperial Vision Technology Company Limited, Fuzhou, Fujian 350001, China)
Source: Journal of Computer Applications (《计算机应用》), CSCD, Peking University Core Journal, 2019, No. 10, pp. 2802-2808 (7 pages)
Funding: National Natural Science Foundation of China (61802065)
Keywords: super-resolution; knowledge distillation; convolutional neural network compression; teacher network; student network