Abstract
A lightweight multi-attention fusion network (MAFN) is introduced to effectively reduce network parameters and computational complexity without compromising performance. Specifically, MAFN is built from several stacked multi-scale convolution and attention blocks (MCABs). An MCAB first employs a branch residual multi-scale convolution block (BRMB) to extract useful features from diverse receptive fields. A series of nonlinear free activation blocks (NFABs) is then applied, followed by cascaded channel attention and spatial attention mechanisms to improve the feature representation capability of the network. Extensive experiments show that the proposed MAFN effectively enhances network performance while reducing network parameters, and it outperforms other advanced competitors.
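The abstract's description of an MCAB — multi-scale feature extraction over several receptive fields, followed by cascaded channel and spatial attention — can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: all function names are assumptions, the learned convolution kernels are replaced by untrained mean filters, and the NFABs are omitted.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def box_filter(x, k):
    """Naive k-by-k mean filter with zero padding, applied per channel.

    A stand-in for the learned k-by-k convolution branches of the BRMB;
    the real block uses trained kernels rather than simple averaging.
    """
    p = k // 2
    c, h, w = x.shape
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))
    out = np.empty_like(x)
    for i in range(h):
        for j in range(w):
            out[:, i, j] = xp[:, i:i + k, j:j + k].mean(axis=(1, 2))
    return out

def branch_residual_multiscale(x):
    # Fuse features from 3x3 and 5x5 receptive fields and add the
    # identity (residual) path, mirroring the BRMB description.
    return x + box_filter(x, 3) + box_filter(x, 5)

def channel_attention(x):
    # Global average pooling over H x W yields one gate per channel.
    gate = sigmoid(x.mean(axis=(1, 2)))        # shape (C,)
    return x * gate[:, None, None]

def spatial_attention(x):
    # Pooling over channels yields one gate per spatial position.
    gate = sigmoid(x.mean(axis=0))             # shape (H, W)
    return x * gate[None, :, :]

def mcab(x):
    # Multi-scale extraction followed by the cascaded channel and
    # spatial attention named in the abstract (NFABs omitted here).
    return spatial_attention(channel_attention(branch_residual_multiscale(x)))

rng = np.random.default_rng(0)
feat = rng.random((8, 16, 16))                 # (channels, height, width)
out = mcab(feat)
print(out.shape)                               # (8, 16, 16)
```

The attention gates keep the feature map's shape unchanged, so MCABs can be stacked freely, which is how the full MAFN is assembled from these blocks.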
Authors
CHEN Xinyu; FANG Jinsheng (School of Computer Science and Engineering, Minnan Normal University, Zhangzhou, Fujian 363000, China; Key Laboratory of Data Science and Intelligence Application, Fujian Province, Zhangzhou, Fujian 363000, China)
Source
Journal of Minnan Normal University (Natural Science), 2023, No. 4, pp. 73-81 (9 pages)
Funding
Natural Science Foundation of Fujian Province (Grant No. 2021J011005).
Keywords
image super-resolution reconstruction
attention mechanism
multi-scale convolution module
lightweight network
deep learning