Abstract
To strengthen the accuracy gains that attention mechanisms bring to deep neural networks, a re-parameterized channel attention module (RCAM) was proposed. Since the channel compression used in squeeze-and-excitation networks noticeably degrades network accuracy, this dimensionality-reduction step was abandoned; instead, a channel re-parameterization module based on re-parameterization techniques was designed and combined with the attention mechanism. Following the integration strategy determined by ablation experiments, the attention module was placed into the backbone network. Experimental results on the public datasets CIFAR-100 and ImageNet-100 show that, with RepVGG_A0 as the backbone, accuracy improves by 2.37% and 1.72% respectively over the network without attention, and with ResNet-18 as the backbone, by 1.61% and 0.71% respectively. Comparison with other well-known attention mechanisms verifies that the proposed re-parameterization-based attention mechanism improves the backbone network to a markedly greater extent.
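Since the abstract only summarizes the idea, the following minimal PyTorch sketch illustrates what a channel attention block without SE-style channel reduction might look like when its parallel excitation branches can be merged by structural re-parameterization at inference time. The class name RCAMSketch, the choice of two full-rank linear branches plus an identity path, and the merge rule are illustrative assumptions, not the authors' published design.

```python
import torch
import torch.nn as nn

class RCAMSketch(nn.Module):
    """Illustrative channel attention without SE's reduction ratio.

    Training time: two parallel per-channel linear branches plus an identity
    path act on the pooled channel descriptor. Inference time: because all
    branches are linear, they can be merged into one equivalent Linear layer.
    This is a hypothetical sketch, not the paper's exact architecture.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Full-channel linear branches; no bottleneck/reduction ratio.
        self.fc_a = nn.Linear(channels, channels, bias=True)
        self.fc_b = nn.Linear(channels, channels, bias=False)
        self.gate = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        s = self.pool(x).view(b, c)          # squeeze: global average pooling
        z = self.fc_a(s) + self.fc_b(s) + s  # parallel branches + identity
        w = self.gate(z).view(b, c, 1, 1)    # channel weights in (0, 1)
        return x * w                         # re-weight feature maps

    @torch.no_grad()
    def reparameterize(self) -> nn.Linear:
        """Merge the parallel branches into a single equivalent Linear layer."""
        c = self.fc_a.in_features
        merged = nn.Linear(c, c, bias=True)
        merged.weight.copy_(self.fc_a.weight + self.fc_b.weight + torch.eye(c))
        merged.bias.copy_(self.fc_a.bias)
        return merged
```

At deployment, reparameterize() would replace the three parallel branches with the single merged Linear layer, following the RepVGG-style decoupling of training-time and inference-time structure referred to in the abstract.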
Authors
YE Han-min; LU Si-qi; CHENG Xiao-hui; ZHANG Rui-fang
(College of Information Science and Engineering, Guilin University of Technology, Guilin 541006, China; Guangxi Key Laboratory of Embedded Technology and Intelligent System, Guilin University of Technology, Guilin 541004, China; College of Mechanical and Control Engineering, Guilin University of Technology, Guilin 541006, China)
Source
Computer Engineering and Design (《计算机工程与设计》)
Peking University Core Journal (北大核心)
2024, No. 10, pp. 2960-2969 (10 pages)
Funding
National Natural Science Foundation of China (61662017)
Open (Director) Fund of the Guangxi Key Laboratory of Embedded Technology and Intelligent System (2019-01-10)
Keywords
re-parameterization
attention mechanism
channel attention mechanism
convolutional neural network
neural network
image classification
deep learning