
Vegetation Information Extraction Using Lightweight DeepLabV3+
Abstract: To address the large parameter count, low computational efficiency, and poor spatial-scale adaptability of the DeepLabV3+ network in vegetation extraction, this paper proposes a lightweight DeepLabV3+ model, using 2 m resolution GF-1D imagery of Chongqing as the data source and a self-built vegetation sample dataset. The model adopts MobileNetV2 as the backbone network: depthwise separable convolutions and inverted residual blocks reduce the parameter count while preserving baseline performance, the ASPP dilation rates are redesigned to improve extraction of woodland and grassland at different scales, and an scSE attention module is integrated to capture accurate vegetation and edge feature information. The results show that: (1) the parameter size of the lightweight DeepLabV3+ model is reduced from 208.7 MB to 25.28 MB, and the average training time per batch is reduced from 7.47 min to 1.92 min; (2) the model achieves a mean intersection over union (MIoU) of 75.8%, a mean pixel accuracy (MPA) of 86.49%, and an accuracy (ACC) of 91.32%.
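The parameter saving the abstract attributes to depthwise separable convolution follows from a simple count: a standard convolution learns one k×k filter per (input, output) channel pair, while the depthwise-plus-pointwise factorization learns one k×k filter per input channel followed by a 1×1 channel-mixing convolution. A minimal sketch of that arithmetic (the channel sizes are illustrative, not taken from the paper):

```python
def standard_conv_params(c_in, c_out, k):
    # Standard convolution: one k x k filter per (input, output) channel pair.
    return k * k * c_in * c_out

def depthwise_separable_params(c_in, c_out, k):
    # Depthwise stage: one k x k filter per input channel.
    # Pointwise stage: a 1 x 1 convolution that mixes channels.
    return k * k * c_in + c_in * c_out

# Hypothetical layer: 256 -> 256 channels with a 3 x 3 kernel.
std = standard_conv_params(256, 256, 3)        # 589,824 weights
dws = depthwise_separable_params(256, 256, 3)  # 2,304 + 65,536 = 67,840 weights
print(std, dws, round(std / dws, 1))           # ~8.7x fewer parameters
```

The ~k² reduction factor is what lets MobileNetV2-style backbones shrink a network's footprint without changing its receptive field.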
Authors: Lin Na; Zhou Junyu; He Jing; Guo Jiang (School of Smart City, Chongqing Jiaotong University, Chongqing 400074, China)
Source: Bulletin of Science and Technology, 2023, Issue 8, pp. 1-6, 18 (7 pages)
Funding: National Key R&D Program of China (2021YFB2600600, 2021YFB2600603)
Keywords: DeepLabV3+; MobileNetV2; ASPP; attention mechanism; GF-1D; deep learning
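The abstract's redesign of the ASPP dilation rates works because dilation stretches a kernel's coverage without adding weights: a k×k kernel with dilation d spans k + (k-1)(d-1) pixels per side. A small sketch of that formula, using the (6, 12, 18) rates of the standard DeepLabV3+ ASPP for comparison (the paper's optimized rates are not given in this abstract):

```python
def effective_kernel(k, d):
    # A k x k kernel with dilation d covers k + (k - 1) * (d - 1) pixels per side,
    # while still using only k * k weights.
    return k + (k - 1) * (d - 1)

# Standard DeepLabV3+ ASPP branches: a 1x1 conv plus 3x3 convs at rates 6, 12, 18.
for d in (1, 6, 12, 18):
    print(f"dilation {d:2d}: effective kernel {effective_kernel(3, d)}")
```

Tuning these rates trades off sensitivity to small patches (low d) against context for large, contiguous woodland or grassland regions (high d).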