
Crop Disease Recognition with Self-Supervised Learning and Vision Transformer
Abstract  To address the limited robustness to interference of crop disease recognition models based on deep convolutional neural networks, this paper proposes HMLP_TR-ViT, a model that fuses self-supervised learning with the Vision Transformer. In the pre-training stage, an HMLP patch-serialization structure strengthens the hierarchical feature extraction of the MAE model; in the fine-tuning stage, a sequence reorganization operation discards uninformative background token blocks from the self-attention computation, which both speeds up inference in the encoder layers and lets the model focus on the diseased regions of the crop. In experiments on the public PlantVillage dataset and the self-built PDVD-7 dataset, the recognition rates are 99.90% and 98.37% with the HMLP structure, and 99.92% and 98.37% after sequence reorganization. Compared with DCNNs (ResNet, EfficientNet, and ConvNeXt), HMLP_TR-ViT achieves the best disease recognition performance on both datasets, at 99.92% and 98.46% respectively.
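The article itself includes no code. As an illustration only, the following is a minimal PyTorch sketch of what an hMLP-style patch-serialization stem could look like: a 16x16 patch embedding composed of three successive non-overlapping projections instead of a single flat one, so that the tokens entering the MAE encoder carry hierarchical information. The class name, channel widths, and activation choices are assumptions, not details taken from the paper.

    import torch
    import torch.nn as nn

    class HMLPStem(nn.Module):
        """Hypothetical hMLP-style stem: a 16x16 patch embedding built from
        three successive non-overlapping projections (4x4, then 2x2, then 2x2).
        A conv with stride equal to kernel size is a linear map per patch."""

        def __init__(self, in_chans: int = 3, embed_dim: int = 768):
            super().__init__()
            self.stem = nn.Sequential(
                nn.Conv2d(in_chans, embed_dim // 4, kernel_size=4, stride=4),       # 4x4 patches
                nn.GELU(),
                nn.Conv2d(embed_dim // 4, embed_dim // 4, kernel_size=2, stride=2), # -> 8x8
                nn.GELU(),
                nn.Conv2d(embed_dim // 4, embed_dim, kernel_size=2, stride=2),      # -> 16x16
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.stem(x)                     # (B, D, H/16, W/16)
            return x.flatten(2).transpose(1, 2)  # (B, N, D) token sequence

For the fine-tuning-stage sequence reorganization, the abstract only states that meaningless background token blocks are discarded during self-attention. One common way to realize this (an EViT-style scheme) is to rank patch tokens by the attention the [CLS] token pays them and keep only the top fraction; the sketch below assumes that mechanism, and keep_ratio and the function name are hypothetical.

    def reorganize_tokens(tokens: torch.Tensor,
                          cls_attn: torch.Tensor,
                          keep_ratio: float = 0.7) -> torch.Tensor:
        """Keep only the patch tokens most attended to by [CLS]; drop the rest.

        tokens:   (B, 1+N, D) -- [CLS] token followed by N patch tokens
        cls_attn: (B, N)      -- attention from [CLS] to each patch, head-averaged
        """
        cls_tok, patches = tokens[:, :1], tokens[:, 1:]
        n_keep = max(1, int(patches.size(1) * keep_ratio))
        idx = cls_attn.topk(n_keep, dim=1).indices                # most-attended patches
        idx = idx.unsqueeze(-1).expand(-1, -1, patches.size(-1))  # (B, n_keep, D)
        kept = patches.gather(1, idx)
        return torch.cat([cls_tok, kept], dim=1)                  # shortened sequence

Pruning between encoder blocks shortens the sequence that later self-attention layers must process, which is consistent with the reported inference speed-up and with the model concentrating on diseased regions.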
Authors  ZHANG Guanghai, XU Jiawei, XIA Huijuan, WANG Yang, ZHANG Hui, DUAN Mengmeng (School of Big Data & Artificial Intelligence, Wuhu University, Wuhu, Anhui 241000, China; School of Computer and Information, Anhui Normal University, Wuhu, Anhui 241002, China)
Source  Journal of Mianyang Teachers' College, 2024, No. 11, pp. 93-101 (9 pages)
Funding  Anhui Provincial Universities Outstanding Young Talents Support Program (gxyq2022167).
Keywords  Crop disease recognition; Self-supervised learning; Sequence reorganization; Vision Transformer; Self-attention