Abstract
Generative adversarial networks (GANs) have received widespread attention in the field of infrared and visible image fusion, but single-path fusion tends to lose shallow information, and the feature extraction and fusion capability of branch paths is limited. This paper proposes an infrared and visible image fusion method based on a multi-path generative adversarial network. In the generator, three input paths are constructed from the source images and their guided-filtering results to extract more feature information from the source images and obtain fused images with richer detail. A mask attention module is then added to the convolutional layers to improve the efficiency of extracting salient information, and dense connections and residual connections are introduced to improve feature-transmission efficiency while capturing more of the important feature information in the source images. On the discriminator side, dual discriminators are used to estimate the regional distributions of the infrared and visible images, avoiding the modal-imbalance problem of a single-discriminator network losing contrast information. Experiments on the TNO dataset show that the proposed algorithm achieves the best results on four of five objective evaluation metrics, outperforming most mainstream algorithms; in subjective evaluation, it retains more texture detail and yields better visual quality.
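The abstract names the main architectural ingredients (three input paths built from the source images and guided-filtering results, mask attention in the convolutional layers, dense and residual connections, and dual per-modality discriminators) but not their implementation. The following PyTorch sketch is an illustrative reconstruction under stated assumptions, not the authors' code: the channel widths, the box-filter-based guided filter, the sigmoid mask-attention block, and the choice to derive the third path as the high-frequency residue of the visible image are all assumptions introduced here for illustration.

```python
# Minimal sketch of a three-path generator with dual discriminators, as
# described in the abstract. All names, widths, and design details below
# are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def box_filter(x, r):
    """Mean filter with window radius r, used by the guided filter."""
    k = 2 * r + 1
    return F.avg_pool2d(x, k, stride=1, padding=r, count_include_pad=False)

def guided_filter(guide, src, r=4, eps=1e-2):
    """Classic guided image filter (He et al.); yields a smoothed base
    layer from which a detail path can be derived."""
    mean_g = box_filter(guide, r)
    mean_s = box_filter(src, r)
    cov_gs = box_filter(guide * src, r) - mean_g * mean_s
    var_g = box_filter(guide * guide, r) - mean_g * mean_g
    a = cov_gs / (var_g + eps)
    b = mean_s - a * mean_g
    return box_filter(a, r) * guide + box_filter(b, r)

class MaskAttention(nn.Module):
    """Hypothetical mask-attention block: a 1-channel sigmoid mask that
    re-weights salient activations (a stand-in for the paper's module)."""
    def __init__(self, ch):
        super().__init__()
        self.mask = nn.Sequential(nn.Conv2d(ch, 1, 3, padding=1), nn.Sigmoid())
    def forward(self, x):
        return x * self.mask(x) + x  # residual path so information is kept

class PathBranch(nn.Module):
    """One input path: densely connected conv blocks with mask attention."""
    def __init__(self, in_ch=1, growth=16, blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList()
        ch = in_ch
        for _ in range(blocks):
            self.blocks.append(nn.Sequential(
                nn.Conv2d(ch, growth, 3, padding=1), nn.LeakyReLU(0.2),
                MaskAttention(growth)))
            ch += growth  # dense connection: concatenate all prior features
        self.out_ch = ch
    def forward(self, x):
        feats = [x]
        for blk in self.blocks:
            feats.append(blk(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)

class MultiPathGenerator(nn.Module):
    """Three paths: infrared, visible, and a guided-filter detail path."""
    def __init__(self):
        super().__init__()
        self.ir_path = PathBranch()
        self.vis_path = PathBranch()
        self.detail_path = PathBranch()
        self.fuse = nn.Sequential(
            nn.Conv2d(self.ir_path.out_ch * 3, 32, 3, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(32, 1, 1), nn.Tanh())
    def forward(self, ir, vis):
        # Assumed third input: the high-frequency residue of the visible
        # image after guided filtering (one plausible reading of the text).
        detail = vis - guided_filter(vis, vis)
        f = torch.cat([self.ir_path(ir), self.vis_path(vis),
                       self.detail_path(detail)], dim=1)
        return self.fuse(f)

class Discriminator(nn.Module):
    """PatchGAN-style critic; one instance per modality (dual critics)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, 2, 1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, 1, 1))
    def forward(self, x):
        return self.net(x)

# Usage: the fused image is scored by both critics, so the adversarial
# signal pulls it toward both modalities rather than letting one dominate.
ir, vis = torch.rand(1, 1, 128, 128), torch.rand(1, 1, 128, 128)
fused = MultiPathGenerator()(ir, vis)
d_ir, d_vis = Discriminator(), Discriminator()
print(fused.shape, d_ir(fused).shape, d_vis(fused).shape)
```

Training the two discriminators against the infrared and visible references separately is what addresses the modal-imbalance problem the abstract mentions: with a single critic, the generator can satisfy the adversarial loss by matching only one modality's distribution and discarding the other's contrast information.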
Authors
Xu Guangyu, Chen Haoyu, Zhang Jie (School of Computer Science and Engineering, Anhui University of Science and Technology, Huainan 232001, China)
Source
Foreign Electronic Measurement Technology, 2024, No. 3, pp. 18-27 (10 pages)
Funding
National Natural Science Foundation of China (61471004)
Anhui University of Science and Technology Doctoral Fund (ZX942)
Anhui University of Science and Technology Graduate Innovation Fund (2022CX2125)