Abstract
At present, crop classification methods based on time-series feature extraction require considerable prior knowledge and manual intervention, are difficult to automate, and tend to lose accuracy when effective features are neglected. To address these problems, we propose a convolutional neural network (CNN) crop classification method based on time-series spectral reconstruction. To fully exploit the rich crop phenological and multispectral information in time-series multispectral imagery, the method constructs, for every ground pixel, a time-spectral image with the time dimension as the vertical axis and the spectral dimension as the horizontal axis, and then classifies these images with a CNN optimized using the Adam gradient descent method and Dropout with a 40% connection rate. Comparative experiments show that the method effectively reduces salt-and-pepper noise and yields clear field-boundary outlines. Its overall classification accuracy reaches 95.12%, higher than the time-series multispectral + random forest (88.58%), time-series NDVI + random forest (90.25%), and time-series NDVI + CNN (91.79%) control groups. For spring maize and tomato, which exhibit a pronounced "same spectrum for different objects" effect, the F1-scores of the proposed method reach 95.9% and 89.9% respectively, a substantial improvement over all control groups. These results provide a reference for automated, fine-grained crop mapping from remote sensing imagery.
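As an illustration only (this record does not give the paper's exact network architecture, band count, or image dimensions), the sketch below shows the general idea in Python/TensorFlow: each pixel's multispectral time series is rearranged into a small time × band image, and a compact CNN trained with the Adam optimizer and Dropout classifies it. All array shapes, layer sizes, and the interpretation of the "40% connection rate" as dropping 40% of units are assumptions, not the authors' implementation.

```python
# Minimal sketch of the pipeline described in the abstract (hypothetical shapes).
import numpy as np
import tensorflow as tf

N_DATES, N_BANDS, N_CLASSES = 12, 10, 6   # assumed dimensions, not from the paper

def build_time_spectral_images(stack):
    """stack: (n_dates, n_bands, height, width) reflectance cube.
    Returns one time-spectral image per pixel, shape (h*w, n_dates, n_bands, 1)."""
    n_dates, n_bands, h, w = stack.shape
    pixels = stack.reshape(n_dates, n_bands, h * w)   # flatten the spatial dimensions
    pixels = np.transpose(pixels, (2, 0, 1))          # (h*w, n_dates, n_bands)
    return pixels[..., np.newaxis].astype("float32")  # add a channel axis for the CNN

def build_cnn():
    # Small 2-D CNN over the time (rows) x spectral-band (columns) image of one pixel.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(N_DATES, N_BANDS, 1)),
        tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
        tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling2D(2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dropout(0.4),                 # "40%" taken here as drop rate (assumption)
        tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
    ])

model = build_cnn()
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),   # Adam gradient descent
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(...) would then be run on labeled time-spectral images of sampled pixels.
```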
Authors
FENG Qixin (冯齐心), YANG Liao (杨辽), WANG Weisheng (王伟胜), CHEN Tao (陈桃), HUANG Shuangyan (黄双燕)
(Xinjiang Institute of Ecology and Geography, Chinese Academy of Sciences, Urumqi 830011, China; University of Chinese Academy of Sciences, Beijing 100049, China)
Source
《中国科学院大学学报(中英文)》 (Journal of University of Chinese Academy of Sciences)
Indexed in CSCD and the Peking University Core Journal list (北大核心)
2020, Issue 5, pp. 619-628 (10 pages)
Funding
Supported by the National Key Research and Development Program of China (2017YFB0504204).