Abstract
Multi-source remote sensing data are complementary and synergistic, and deep learning-based methods have made notable progress in multi-source remote sensing image classification in recent years. Building a powerful joint classification model nevertheless remains difficult for two reasons: feature fusion is hampered by the heterogeneity gap between hyperspectral image (HSI) and LiDAR data, and the static inference paradigm of conventional neural networks limits their adaptability to different land-cover categories. To address both problems, we propose a multi-source remote sensing image classification model based on a cross-modal Transformer and multi-scale dynamic 3D convolution. To make multi-source feature representations more consistent, we design a Transformer-based fusion module whose attention mechanism models the interactions between HSI and LiDAR features. To make feature extraction more adaptive to different land-cover categories, we design a multi-scale dynamic 3D-convolution module that injects multi-scale information from the input features into the modulation of the convolution kernels. The method was validated on the Houston and Trento multi-source datasets, achieving overall accuracies of 94.60% and 98.21%, respectively, at least 0.97% and 0.25% higher than mainstream methods such as MGA-MFN. The experimental results demonstrate that the proposed method effectively improves the accuracy of multi-source remote sensing image classification.
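The two modules described in the abstract can be illustrated with minimal sketches. The code below is a rough reading of the abstract, not the authors' implementation: PyTorch is assumed, and all class names, tensor shapes, and hyper-parameters are invented for illustration. The first sketch shows Transformer-based cross-modal fusion, with HSI tokens querying LiDAR tokens through multi-head attention.

```python
# Sketch 1 (hypothetical): Transformer-based cross-modal fusion, assuming PyTorch.
import torch
import torch.nn as nn


class CrossModalFusion(nn.Module):
    """HSI tokens attend to LiDAR tokens via multi-head cross-attention."""

    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, hsi: torch.Tensor, lidar: torch.Tensor) -> torch.Tensor:
        # Query = HSI, key/value = LiDAR: each HSI token gathers the LiDAR
        # evidence relevant to it, aligning the two feature spaces.
        fused, _ = self.attn(hsi, lidar, lidar)
        return self.norm(hsi + fused)  # residual keeps the HSI signal intact
```

A symmetric branch with LiDAR features as the query would make the fusion bidirectional; the abstract leaves this detail open. The second sketch shows one common realization of input-conditioned ("dynamic") convolution: multi-scale pooled context weights a small bank of 3D-convolution kernels, so the effective filter changes with the input patch. How exactly the paper modulates the kernels is not specified in the abstract; the kernel-bank formulation here is an assumption.

```python
# Sketch 2 (hypothetical): multi-scale dynamic 3D convolution as a kernel bank.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleDynamicConv3d(nn.Module):
    """Multi-scale pooled context selects a mixture of 3D-conv kernels."""

    def __init__(self, channels: int = 16, n_kernels: int = 4):
        super().__init__()
        # A small bank of candidate kernels; the input decides their mixture.
        self.kernels = nn.ModuleList(
            [nn.Conv3d(channels, channels, 3, padding=1) for _ in range(n_kernels)]
        )
        # Context pooled at three scales (1^3, 2^3, 4^3 grids).
        self.pools = nn.ModuleList([nn.AdaptiveAvgPool3d(s) for s in (1, 2, 4)])
        self.gate = nn.Linear(channels * (1 + 8 + 64), n_kernels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, bands, H, W) patch cube around the pixel to classify.
        ctx = torch.cat([p(x).flatten(1) for p in self.pools], dim=1)
        w = F.softmax(self.gate(ctx), dim=1)                      # (B, K)
        outs = torch.stack([k(x) for k in self.kernels], dim=1)   # (B, K, C, D, H, W)
        # Input-conditioned mixture of kernels: the "dynamic" convolution.
        return (w.view(x.size(0), -1, 1, 1, 1, 1) * outs).sum(dim=1)


# Example (shapes assumed): a 9x9 spatial patch with 8 spectral bands.
# x = torch.randn(2, 16, 8, 9, 9); y = MultiScaleDynamicConv3d()(x)  # same shape
```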
Authors
GAO Feng, MENG Desen, XIE Zhengyuan, QI Lin, DONG Junyu
School of Computer Science and Technology, Ocean University of China, Qingdao 266100, China
Source
Journal of Beijing University of Aeronautics and Astronautics (北京航空航天大学学报), 2024, No. 2, pp. 606-614 (9 pages)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
Funding
National Key R&D Program of China (2018AAA0100602)
Natural Science Foundation of Shandong Province (ZR2019QD011)