Temporal Sequence Object-based CNN (TS-OCNN) for crop classification from fine resolution remote sensing image time-series (cited 3 times)
Authors: Huapeng Li, Yajun Tian, Ce Zhang, Shuqing Zhang, Peter M. Atkinson. The Crop Journal (SCIE, CSCD), 2022, No. 5, pp. 1507–1516 (10 pages).
Accurate crop distribution mapping is required for crop yield prediction and field management. Due to rapid progress in remote sensing technology, fine spatial resolution (FSR) remotely sensed imagery now offers great opportunities for mapping crop types in great detail. However, within-class variance can hamper attempts to discriminate crop classes at fine resolutions. Multi-temporal FSR remotely sensed imagery provides a means of increasing crop classification accuracy, although current methods do not exploit the available information fully. In this research, a novel Temporal Sequence Object-based Convolutional Neural Network (TS-OCNN) was proposed to classify agricultural crop type from FSR image time-series. An object-based CNN (OCNN) model was adopted in the TS-OCNN to classify images at the object level (i.e., segmented objects or crop parcels), thus maintaining the precise boundary information of crop parcels. The combination of image time-series was first utilized as the input to the OCNN model to produce an 'original' or baseline classification. Then the single-date images were fed automatically into the deep learning model scene-by-scene, in order of image acquisition date, to increase successively the crop classification accuracy. By doing so, the joint information in the FSR multi-temporal observations and the unique individual information from the single-date images were exploited comprehensively for crop classification. The effectiveness of the proposed approach was investigated using multi-temporal SAR and optical imagery, respectively, over two heterogeneous agricultural areas.
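The two-stage procedure described above (a baseline classification from the stacked time-series, then scene-by-scene refinement in acquisition order) can be sketched as follows. This is a minimal illustration of the data flow only: the `classify_objects` function is a hypothetical stand-in for the paper's OCNN classifier (here a random linear projection, not a trained network), and feature arrays are assumed to be already aggregated to the object (parcel) level.

```python
import numpy as np

def classify_objects(features, n_classes, rng):
    """Hypothetical stand-in for the OCNN: a random linear projection
    followed by argmax, used purely to illustrate the data flow."""
    w = rng.standard_normal((features.shape[1], n_classes))
    return (features @ w).argmax(axis=1)

def ts_ocnn_sketch(image_series, n_classes, seed=0):
    """Sketch of the TS-OCNN ordering described in the abstract.

    image_series: list of (n_objects, n_features) arrays, one per
    acquisition date, in order of acquisition.
    Returns one class label per object (crop parcel).
    """
    rng = np.random.default_rng(seed)
    # Stage 1: baseline classification from the full stacked time-series.
    stacked = np.concatenate(image_series, axis=1)
    labels = classify_objects(stacked, n_classes, rng)
    # Stage 2: feed single-date scenes one-by-one in acquisition order;
    # each pass re-classifies using the single date plus a one-hot
    # encoding of the current labels, successively refining the result.
    for scene in image_series:
        onehot = np.eye(n_classes)[labels]
        labels = classify_objects(np.hstack([scene, onehot]), n_classes, rng)
    return labels
```

The key design point the sketch preserves is that the multi-temporal stack provides the prior, while each single-date image contributes its individual information in a later refinement pass.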
The experimental results demonstrated that the newly proposed TS-OCNN approach consistently increased crop classification accuracy and achieved the greatest accuracies (82.68% and 87.40%) in comparison with state-of-the-art benchmark methods, including the object-based CNN (OCNN) (81.63% and 85.88%), object-based image analysis (OBIA) (78.21% and 84.83%), and standard pixel-wise CNN (79.18% and 82.90%). The proposed approach is the first known attempt to explore simultaneously the joint information from image time-series and the unique information from single-date images for crop classification using a deep learning framework. The TS-OCNN, therefore, represents a new approach for agricultural landscape classification from multi-temporal FSR imagery. Moreover, it is readily generalizable to other landscapes (e.g., forest landscapes), with wide application prospects.
Keywords: Convolutional neural network; Multi-temporal imagery; Object-based image analysis (OBIA); Crop classification; Fine spatial resolution imagery