
Fine-Grained Image Classification Algorithm Based on Attention Mechanism and Circular Convolutional Neural Network (Cited by: 5)
Abstract: Fine-grained image classification is a very active research direction in computer vision. Because subcategories of the same species share similar appearances and colors, the differences between them are extremely subtle, which makes fine-grained image classification highly challenging. To address this challenge, this paper proposes an attention-based circular convolutional neural network for fine-grained image classification. First, salient object regions are extracted from an image iteratively according to the attention mechanism; then, the original image and each extracted salient region are classified separately; finally, the classification-layer scores are fused to produce the final prediction. Experiments on the challenging public datasets CUB-200-2011, Stanford Dogs, and Stanford Cars, with comparisons against state-of-the-art methods, show that the proposed method is highly effective.
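The final fusion step described in the abstract, combining the classification scores of the full image with those of each extracted salient region, can be sketched as follows. This is a minimal illustration, not the authors' code: the softmax normalization, equal weighting, and the toy logit values are all assumptions.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over class logits.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def fuse_scores(logits_per_view):
    # Average the softmax probabilities produced by the classifiers
    # for the full image and each attended salient region.
    probs = [softmax(l) for l in logits_per_view]
    return np.mean(probs, axis=0)

# Toy example with 3 classes: full image plus two salient crops.
full_image = np.array([2.0, 1.0, 0.1])
crop_1 = np.array([1.5, 2.5, 0.2])
crop_2 = np.array([2.2, 0.8, 0.3])

fused = fuse_scores([full_image, crop_1, crop_2])
predicted_class = int(np.argmax(fused))  # class favored by two of the three views
```

Averaging probabilities rather than raw logits keeps each view's contribution bounded, so one overconfident classifier cannot dominate the fused decision.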
Authors: WANG Wei; WU Fang (College of Information Engineering, Zhengzhou Institute of Technology, Zhengzhou 450044, China; College of Physical and Electronic Engineering, Henan Finance University, Zhengzhou 450046, China)
Source: Journal of Southwest China Normal University (Natural Science Edition), 2020, No. 1, pp. 48-56 (9 pages); indexed in CAS and the Peking University Core Journal list.
Funding: Key Science and Technology Research Project of Henan Province (182102210594); Key Scientific Research Project of Higher Education Institutions of Henan Province (18A140013)
Keywords: fine-grained image classification; saliency detection; attention mechanism; convolutional neural network
