
Infrared and Visible Image Fusion Based on Significant Matrix and Neural Network (Cited by: 13)
Abstract: To address the severe loss of detail and the poor visual quality that arise in the fusion of infrared and visible images, a fusion method based on a multi-scale geometric transformation model is proposed. First, an improved visual saliency detection algorithm is applied to the infrared and visible images and a saliency matrix is constructed. Then, the infrared and visible images are decomposed by the non-subsampled shearlet transform into the corresponding low-frequency and high-frequency subbands; the low-frequency subbands are fused by adaptive weighting with the saliency matrix, while the high-frequency subbands are fused by a simplified pulse coupled neural network combined with the multi-direction sum-modified-Laplacian. Finally, the fused image is obtained by the inverse transform. Experimental results show that this method effectively improves the contrast of the fused image and preserves the detail information of the source images; the fused image has good visual quality and performs well on multiple objective evaluation metrics.
Authors: Shen Yu, Chen Xiaopeng, Yuan Yubin, Wang Lin, Zhang Hongguo (School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou, Gansu 730070, China)
Source: Laser & Optoelectronics Progress (CSCD, Peking University Core Journal), 2020, No. 20, pp. 68-78 (11 pages)
Funding: National Natural Science Foundation of China (61861025).
Keywords: image processing; image fusion; saliency detection; non-subsampled shearlet transform; pulse coupled neural networks
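
The following is a minimal NumPy sketch of the two fusion rules outlined in the abstract, not the authors' implementation: the NSST decomposition, the improved saliency detector, and the simplified PCNN are assumed to be provided elsewhere (the functions take subbands and saliency maps as inputs), and the high-frequency rule below stands in for the PCNN with a plain choose-max decision driven by a sum-modified-Laplacian activity measure. All function and variable names are illustrative.

import numpy as np


def saliency_weight_maps(sal_ir, sal_vis, eps=1e-6):
    # Normalize two saliency maps into per-pixel fusion weights that sum to ~1.
    total = sal_ir + sal_vis + eps
    return sal_ir / total, sal_vis / total


def fuse_low_frequency(low_ir, low_vis, sal_ir, sal_vis):
    # Adaptive weighted fusion of the low-frequency subbands using the saliency matrix.
    w_ir, w_vis = saliency_weight_maps(sal_ir, sal_vis)
    return w_ir * low_ir + w_vis * low_vis


def sum_modified_laplacian(band, step=1):
    # Sum-modified-Laplacian (SML): local activity measure built from second differences.
    band = np.asarray(band, dtype=np.float64)
    p = np.pad(band, step, mode="reflect")
    c = p[step:-step, step:-step]
    # Modified Laplacian: absolute horizontal and vertical second differences.
    ml = (np.abs(2 * c - p[step:-step, :-2 * step] - p[step:-step, 2 * step:])
          + np.abs(2 * c - p[:-2 * step, step:-step] - p[2 * step:, step:-step]))
    # Sum the modified Laplacian over a 3x3 neighbourhood.
    mp = np.pad(ml, 1, mode="reflect")
    sml = np.zeros_like(ml)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            sml += mp[1 + dy:ml.shape[0] + 1 + dy, 1 + dx:ml.shape[1] + 1 + dx]
    return sml


def fuse_high_frequency(high_ir, high_vis):
    # Choose-max fusion of a high-frequency subband pair driven by SML activity
    # (a simplified stand-in for the paper's PCNN fired by multi-direction SML).
    mask = sum_modified_laplacian(high_ir) >= sum_modified_laplacian(high_vis)
    return np.where(mask, high_ir, high_vis)


if __name__ == "__main__":
    # Toy demonstration on random "subbands"; real inputs would come from an NSST
    # decomposition of registered infrared and visible images.
    rng = np.random.default_rng(0)
    low_ir, low_vis = rng.random((64, 64)), rng.random((64, 64))
    high_ir, high_vis = rng.random((64, 64)), rng.random((64, 64))
    sal_ir, sal_vis = rng.random((64, 64)), rng.random((64, 64))
    fused_low = fuse_low_frequency(low_ir, low_vis, sal_ir, sal_vis)
    fused_high = fuse_high_frequency(high_ir, high_vis)
    print(fused_low.shape, fused_high.shape)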