
An Improved SFIM Remote Sensing Image Fusion Method Considering Local Variance Mutual Information and Gradient Consistency
Abstract  Due to differences in spectral and spatial scales, the fusion results of panchromatic and multispectral images often exhibit spectral or spatial distortion. Achieving alignment on both scales simultaneously is the key to improving fusion quality. The traditional Smoothing Filter-based Intensity Modulation (SFIM) remote sensing image fusion method ensures consistency on the spectral scale, but it is not precise enough in measuring spatial scale consistency. To address this issue, this paper proposes a spatial scale alignment method based on the mutual information of local variance images and further improves the SFIM method under the constraint of average gradient consistency. The method first fits the multispectral bands linearly to generate an intensity image and applies Gaussian low-pass filtering to the high-resolution panchromatic image; by varying the filter parameters and iteratively computing the mutual information between the local variance images of the two, the optimal filter parameters are taken as those that maximize the mutual information. The high-resolution panchromatic image is then convolved with this Gaussian filter to obtain a low-resolution panchromatic image that matches the spatial scale of the multispectral image. Next, a detail image is obtained from the ratio of the high-resolution to the low-resolution panchromatic image, and an adjustment coefficient, referenced to the average gradient of the high-resolution panchromatic image, is introduced to control the amount of detail injected. Finally, the fused image is obtained by multiplying the detail image, the adjustment coefficient, and the multispectral image. To validate the method, fusion experiments were conducted on six groups of images covering three scene types (vegetation, building, and mixed areas) from the IKONOS and QuickBird datasets. For the IKONOS data, the proposed method ranks second in the spectral preservation index SAM in all three groups and first in information entropy EN in two groups; for the QuickBird data, it is best in SAM, EN, and average gradient AG in all three groups, demonstrating good spectral preservation and information richness. Although its spatial information preservation index SCC is not the highest, the proposed method clearly outperforms the AGSFIM method, which obtains the best SCC, in SAM, EN, and AG in four groups of experiments; compared with the GSA and SFIM methods, whose SCC values are close to ours, the proposed method is better in the other three indices in all six groups, with average improvements of 13.39%, 39.52%, and 34.03%, respectively. The method also performs well visually: the color difference between the fused true-color image and the original true-color image is small, and the sharpness is close to that of the panchromatic image. In terms of scene type, the method shows a clear advantage in spectral preservation and information richness for vegetation-dominated and mixed areas, while for building-dominated scenes the fusion results are also satisfactory in spectral fidelity, information content, and sharpness.
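For readers who want a concrete picture of the pipeline summarized in the abstract, the sketch below outlines the main steps in Python/NumPy: local-variance images, a mutual-information search over Gaussian filter widths, ratio-based detail extraction, and average-gradient-referenced injection. The window size, the sigma search grid, the histogram-based mutual-information estimate, and the exact form of the adjustment coefficient k are illustrative assumptions, not the implementation published in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter


def local_variance(img, win=7):
    """Local variance image over a win x win sliding window (window size assumed)."""
    mean = uniform_filter(img, win)
    mean_sq = uniform_filter(img * img, win)
    return mean_sq - mean * mean


def mutual_information(a, b, bins=64):
    """Histogram-based mutual information between two images (bin count assumed)."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))


def average_gradient(img):
    """Average gradient (AG), a common image sharpness measure."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))


def fuse_sfim_mi(pan, ms, intensity, sigmas=np.arange(0.5, 5.01, 0.25), eps=1e-6):
    """pan: (H, W) high-resolution panchromatic image.
    ms: (H, W, B) multispectral bands resampled to the PAN grid.
    intensity: (H, W) intensity image from a linear fit of the MS bands."""
    lv_int = local_variance(intensity)
    # 1) Choose the Gaussian width whose low-pass PAN best matches the spatial
    #    scale of the MS intensity: maximize the mutual information between the
    #    two local-variance images.
    best_sigma = max(
        sigmas,
        key=lambda s: mutual_information(local_variance(gaussian_filter(pan, s)), lv_int),
    )
    # 2) Low-resolution PAN at the estimated spatial scale.
    pan_low = gaussian_filter(pan, best_sigma)
    # 3) Ratio detail image (the classical SFIM modulation term).
    detail = pan / (pan_low + eps)
    # 4) Adjustment coefficient referenced to the PAN average gradient
    #    (illustrative choice; the paper's exact definition may differ).
    k = average_gradient(pan) / (average_gradient(detail * intensity) + eps)
    # 5) Modulate every MS band by the scaled detail image.
    fused = ms * (k * detail)[..., None]
    return fused, best_sigma
```

In this reading, classical SFIM corresponds to k = 1; deriving k from the average gradient of the panchromatic image is what regulates how much detail is injected.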
Authors  WANG Shuxiang; JIN Fei; LIN Yuzhun; ZUO Xibing; LIU Xiao (Institute of Geospatial Information, Information Engineering University, Zhengzhou 450001, China)
Source  Journal of Geo-information Science (《地球信息科学学报》; indexed in EI, CSCD, PKU Core Journals), 2024, No. 3, pp. 693-708 (16 pages)
Funding  General Program of the Natural Science Foundation of Henan Province (No. 222300420592).
Keywords  local variance image; mutual information; Gaussian filtering; average gradient; detail injection; SFIM model; remote sensing; image fusion