Abstract
This study, grounded in the Waxman fusion method, introduces an algorithm for fusing visible and infrared images that is tailored to a two-level lighting environment and inspired by the mathematical model of the rattlesnake's visual receptive field and the mechanism of its dual-mode cells. The work is organized into three parts. In the first part, we design a preprocessing module that judges the ambient light intensity and divides the lighting environment into two levels: day and night. The second part proposes two distinct network structures designed specifically for daytime and nighttime images. For daytime images, where visible-light information is predominant, the ON-VIS signal and the IR-enhanced visible signal are fed into the central excitatory region and the surrounding inhibitory region, respectively, of the ON-center receptive field in the B channel. Conversely, for nighttime images, where infrared information takes precedence, the ON-IR signal and the visible-enhanced IR signal are fed into the central excitatory region and the surrounding inhibitory region of the ON-center receptive field in the B channel. The output is a pseudo-color fused image. The third part employs five no-reference image quality assessment metrics to evaluate thirteen sets of pseudo-color images produced by fusing infrared and visible information; these images are then compared with those obtained by six other methods cited in the related references. The empirical results show that the proposed method surpasses the comparison methods in average gradient and spatial frequency; only one or two sets of fused images underperform the comparison results in standard deviation and entropy, and four sets underperform in the QAB/F index. In conclusion, the fused images generated by the proposed method exhibit superior scene detail, visual perception, and image sharpness compared with their counterparts from the other methods.
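To make the day/night branching and the center-surround combination concrete, the following is a minimal sketch of the pipeline outlined above. It models the ON-center receptive field as a Difference of Gaussians; the brightness threshold, the sigma values, the simple additive "enhancement" of the surround input, and the R/G channel assignments of the pseudo-color output are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def on_center_fuse(center_signal, surround_signal, sigma_c=1.0, sigma_s=3.0):
    """Feed one signal to the excitatory center and the other to the
    inhibitory surround of an ON-center receptive field (DoG model, assumed)."""
    center = gaussian_filter(center_signal, sigma_c)
    surround = gaussian_filter(surround_signal, sigma_s)
    return np.clip(center - surround, 0.0, 1.0)

def fuse(vis, ir, day_threshold=0.35):
    """vis, ir: float images in [0, 1]. Returns a pseudo-color RGB image."""
    # Preprocessing: judge ambient light intensity (threshold is assumed).
    is_day = vis.mean() >= day_threshold

    if is_day:
        # Daytime: visible information dominates. ON-VIS drives the center,
        # an IR-enhanced visible signal drives the surround of the B channel.
        ir_enhanced_vis = np.clip(vis + 0.5 * ir, 0.0, 1.0)   # assumed enhancement
        b = on_center_fuse(vis, ir_enhanced_vis)
    else:
        # Nighttime: infrared information dominates. ON-IR drives the center,
        # a visible-enhanced IR signal drives the surround of the B channel.
        vis_enhanced_ir = np.clip(ir + 0.5 * vis, 0.0, 1.0)   # assumed enhancement
        b = on_center_fuse(ir, vis_enhanced_ir)

    # Assumed mapping of the remaining pseudo-color channels.
    r, g = ir, vis
    return np.dstack([r, g, b])
```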
Funding
Supported by the National Natural Science Foundation of China (NSFC) under grant number 61201368 and by the Jilin Province Science and Technology Department Key Research and Development Project (grant no. 20230201043GX).