Abstract
Current image super-resolution reconstruction methods fail to make full use of both the global and local information of an image, so the reconstructed result loses part of the key information in the source image. To address this problem, a multi-scale dense residual network is proposed for image super-resolution reconstruction. The network is built on dense residual connections and fuses multi-scale feature information, ensuring that no feature information is lost as the network deepens while more features under different receptive fields are obtained, thereby avoiding excessive loss of key information from the source image. In addition, to recover a high-resolution image containing sufficient high-frequency information from a low-resolution image dominated by redundant low-frequency information, the network combines spatial attention and channel attention mechanisms to treat the low-resolution features at different scales unequally. Comparative experiments on the Set5 dataset against the dense residual network and other super-resolution methods show that the proposed method effectively highlights the high-frequency components in the feature maps, enables the network to better learn and fit the feature information of the label images, and improves super-resolution reconstruction performance.
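The abstract describes three ingredients: multi-scale feature extraction, dense residual connections, and channel/spatial attention applied unequally across scales. The paper itself gives no code, so the following is only an illustrative NumPy sketch of how such a block could be wired together; the box filters standing in for multi-scale convolution branches, the randomly initialized attention weights, and all function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def box_filter(feat, k):
    """Mean filter of size k x k per channel; a stand-in for a conv
    branch with receptive field k (feat has shape (C, H, W))."""
    if k == 1:
        return feat
    pad = k // 2
    p = np.pad(feat, ((0, 0), (pad, pad), (pad, pad)), mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(p, (k, k), axis=(1, 2))
    return win.mean(axis=(-1, -2))

def channel_attention(feat, reduction=4, rng=None):
    """Squeeze-and-excitation style gate: global average pool, a small
    bottleneck MLP (random weights here, purely illustrative), sigmoid,
    then per-channel rescaling."""
    rng = np.random.default_rng(0) if rng is None else rng
    c = feat.shape[0]
    w1 = 0.1 * rng.standard_normal((c // reduction, c))
    w2 = 0.1 * rng.standard_normal((c, c // reduction))
    s = feat.mean(axis=(1, 2))            # (C,) channel descriptor
    a = sigmoid(w2 @ np.maximum(w1 @ s, 0))
    return feat * a[:, None, None]

def spatial_attention(feat):
    """Gate each spatial position using channel-wise average and max
    maps (fused here by simple addition)."""
    a = sigmoid(feat.mean(axis=0) + feat.max(axis=0))  # (H, W)
    return feat * a[None, :, :]

def multi_scale_dense_residual_block(feat, scales=(1, 3, 5)):
    """Extract features at several receptive fields, densely aggregate
    them (mean as a stand-in for a 1x1 fusion conv), apply channel then
    spatial attention, and add the input back (residual connection)."""
    branches = [feat] + [box_filter(feat, k) for k in scales]
    fused = np.mean(np.stack(branches), axis=0)
    fused = spatial_attention(channel_attention(fused))
    return feat + fused
```

In a real network the box filters would be learned convolutions and the attention MLP weights would be trained; the sketch only shows the data flow: multi-scale branches, dense aggregation, attention gating, and the residual skip that preserves the low-frequency content of the input.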
Authors
YUAN Ming, LI Fan, LI Huafeng, ZHANG Yafei
Kunming University of Science and Technology, School of Information and Automation, Kunming 650000, China
Source
Optical Technique (《光学技术》)
Indexed in: CAS; CSCD; Peking University Core Journal (北大核心)
2022, No. 3, pp. 357-363 (7 pages)
Funding
National Natural Science Foundation of China (62161015)
Science and Technology Plan Project of the Yunnan Provincial Department of Science and Technology - Basic Research Special Program (202101AT070136)
Major Science and Technology Special Project of Yunnan Province (202002AD080001)
Keywords
deep neural network
super-resolution image reconstruction
multi-scale dense residual network
attention