Channel Attention Module Based on Instance Normalization
Abstract: Traditional channel attention modules, which build pooling and downsampling layers to obtain the channel weights of feature maps, not only add a large number of parameters but also greatly increase model complexity. To address this problem, an Instance Normalization Channel Attention Module (INCAM) is proposed, which captures the channel weights of feature maps by using the scaling variable of instance normalization to measure variance; a significant performance gain is achieved by adding only a small number of parameters. Extensive experiments on the CIFAR-100 and CIFAR-10 datasets show that ResNet-50 embedded with INCAM reduces the Top-1 Error by 11.20% compared to the original ResNet-50, while the number of parameters increases by only 0.12%, and INCAM is lighter and more efficient than other attention modules.
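The paper itself gives no code in this abstract; the following is a minimal NumPy sketch of the idea it describes, namely deriving channel weights from the scaling variables of instance normalization instead of from extra pooling or downsampling layers. All function names and implementation details here are assumptions for illustration, not the authors' actual INCAM implementation.

```python
import numpy as np

def incam_sketch(x, gamma, beta, eps=1e-5):
    """Illustrative channel attention based on instance normalization.

    x:     feature map of shape (N, C, H, W)
    gamma: learnable per-channel IN scaling variables, shape (C,)
    beta:  learnable per-channel IN shift, shape (C,)
    """
    # Instance normalization: normalize each (sample, channel) spatial map
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    x_norm = (x - mean) / np.sqrt(var + eps)
    out = gamma[None, :, None, None] * x_norm + beta[None, :, None, None]

    # Channel weights from the scaling variables: channels whose |gamma|
    # is larger are treated as more informative (assumed weighting scheme)
    w = np.abs(gamma) / np.abs(gamma).sum()

    # Sigmoid gate over the weighted normalized features, then reweight input
    attn = 1.0 / (1.0 + np.exp(-(w[None, :, None, None] * out)))
    return x * attn

# Usage: the attention adds only the IN parameters (2 * C scalars),
# consistent with the abstract's claim of a very small parameter overhead.
x = np.random.randn(2, 4, 8, 8)
gamma = np.array([0.1, 1.5, 0.3, 0.9])
beta = np.zeros(4)
y = incam_sketch(x, gamma, beta)
```

Note that, unlike squeeze-and-excitation style modules, no pooling layer or fully connected bottleneck is needed: the channel weights are read directly off parameters the normalization layer already has.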
Authors: SU Shu-zhi (苏树智), JIANG Bo-wen (蒋博文), CHEN Run-bin (陈润斌) (School of Computer Science and Engineering, Anhui University of Science & Technology, Huainan, Anhui 232001, China)
Source: Computer Simulation (《计算机仿真》), 2024, No. 1, pp. 227-231 (5 pages)
Funding: National Natural Science Foundation of China (61806006); China Postdoctoral Science Foundation (2019M660149).
Keywords: Attention module; Convolutional neural network; Image classification; Instance normalization; ResNet