Abstract
An image fusion algorithm that uses only the multiscale edges of the source images is proposed. The algorithm consists of three steps. First, the multiscale edges of the source images are detected. Second, these multiscale edges are fused into the multiscale edges of the fused image according to a fusion rule that maximizes the edge correlation between the fused image and the source images. Finally, the fused image is reconstructed from its multiscale edges. Because only multiscale edges are used, the computational cost of the fusion process is low, and the fusion rule preserves as much of the source images' edge information as possible. Moreover, the algorithm compresses the data to some extent even without any compression coding, which reduces both storage requirements and transmission bandwidth. Simulation results show that the fused images produced by the algorithm effectively contain the information of the source images.
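The three-step pipeline described above can be sketched in code. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation: the function names multiscale_edges, fuse_edges, and reconstruct are hypothetical; the edge detector is a Gaussian-smoothed gradient at dyadic scales standing in for wavelet-transform modulus maxima; the fusion rule simply keeps, per scale and pixel, the edge with the larger modulus as a crude proxy for the paper's edge-correlation-maximizing rule; and the reconstruction of an image from its multiscale edges is only stubbed out.

```python
# Minimal sketch of the abstract's three steps (detect, fuse, reconstruct).
# NOT the paper's method: detector, fusion rule, and reconstruction are
# simplified stand-ins, as noted in the comments below.
import numpy as np
from scipy import ndimage


def multiscale_edges(img, num_scales=3):
    """Return (modulus, gx, gy) at each dyadic scale for one source image.
    Stand-in for wavelet-transform modulus-maxima edge detection."""
    edges = []
    for j in range(num_scales):
        sigma = 2 ** j                       # dyadic scales 1, 2, 4, ...
        smooth = ndimage.gaussian_filter(img.astype(float), sigma)
        gx = ndimage.sobel(smooth, axis=1)   # horizontal gradient
        gy = ndimage.sobel(smooth, axis=0)   # vertical gradient
        edges.append((np.hypot(gx, gy), gx, gy))
    return edges


def fuse_edges(edges_a, edges_b):
    """Per scale and pixel, keep the edge from the source with the larger
    modulus -- a simplified proxy for the edge-correlation-maximizing rule."""
    fused = []
    for (ma, gxa, gya), (mb, gxb, gyb) in zip(edges_a, edges_b):
        take_a = ma >= mb
        fused.append((np.where(take_a, ma, mb),
                      np.where(take_a, gxa, gxb),
                      np.where(take_a, gya, gyb)))
    return fused


def reconstruct(fused_edges, shape):
    """Placeholder only: the paper reconstructs the fused image from its
    multiscale edges; here the moduli are merely averaged to visualize the
    fused edge information."""
    out = np.zeros(shape)
    for modulus, _, _ in fused_edges:
        out += modulus
    return out / len(fused_edges)


if __name__ == "__main__":
    a = np.random.rand(128, 128)             # stand-ins for two source images
    b = np.random.rand(128, 128)
    fused = fuse_edges(multiscale_edges(a), multiscale_edges(b))
    print(reconstruct(fused, a.shape).shape)
```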
Source
《北京航空航天大学学报》
EI
CAS
CSCD
Peking University Core Journals (北大核心)
2007, No. 2, pp. 229-232 (4 pages)
Journal of Beijing University of Aeronautics and Astronautics
Funding
National Natural Science Foundation of China (60502019)
Keywords
image fusion
wavelet transform
multiscale edges
edge correlation