Abstract
Objective For images with parallax captured from different viewpoints, feature point filtering suffers from a high miss-detection rate and low registration accuracy. To address this, an image stitching algorithm based on plane-similarity clustering of feature points is proposed. Method Exploiting the property that feature points on the same plane obey the same transformation, a similarity measure between feature points is computed, agglomerative hierarchical clustering is used to divide the feature points into different planes, and wrongly matched points are filtered out. The image is then partitioned into equal-sized grid cells, the weight of each feature point is computed from the plane information of the feature points and the cells, and the local homography matrix of each cell is estimated by a weighted linear transformation. Finally, the aligned images are fused with a multi-band blending method. Result In feature point filtering experiments on 20 images of different scenes, the random sample consensus (RANSAC) algorithm wrongly filters 30 points and wrongly matches 8 points on average, whereas the proposed method wrongly filters only 3 points and wrongly matches only 2 points on average. In image stitching experiments on multi-view images of 20 different scenes, the proposed method is compared with AutoStitch (automatic stitching), APAP (as projective as possible), and AANAP (adaptive as-natural-as-possible); relative to the second-best algorithm, it improves the peak signal-to-noise ratio (PSNR) by 8.7% and the structural similarity (SSIM) by 9.6% on average. Conclusion The proposed image stitching algorithm based on plane-similarity clustering of feature points retains more feature points, which improves registration accuracy and yields better stitching results.
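As a rough illustration of the clustering and filtering step summarized above, the following Python sketch builds a residual matrix from homographies fitted to random four-point sample sets, clusters the matched points by the similarity of their residual profiles, and rejects clusters with a large mean residual. The function names, the Euclidean similarity over residual rows, and the thresholds are assumptions made for illustration; the paper's exact similarity measure and parameters are not reproduced here.

# Hedged sketch: plane-similarity clustering of matched feature points.
# Assumptions: OpenCV and SciPy are available; the Euclidean distance between
# residual profiles stands in for the paper's similarity measure.
import numpy as np
import cv2
from scipy.cluster.hierarchy import linkage, fcluster

def build_residual_matrix(src, dst, n_hypotheses=200, seed=0):
    """Residual of every correspondence under homographies fitted to random
    four-point minimal sample sets (src, dst: Nx2 arrays of matched points)."""
    rng = np.random.default_rng(seed)
    n = len(src)
    residuals = np.zeros((n, n_hypotheses))
    src_h = np.hstack([src, np.ones((n, 1))])            # homogeneous coordinates
    for k in range(n_hypotheses):
        idx = rng.choice(n, size=4, replace=False)       # collinearity check omitted
        H = cv2.getPerspectiveTransform(src[idx].astype(np.float32),
                                        dst[idx].astype(np.float32))
        proj = (H @ src_h.T).T
        proj = proj[:, :2] / proj[:, 2:3]                # back to Cartesian coordinates
        residuals[:, k] = np.linalg.norm(proj - dst, axis=1)
    return residuals

def plane_clusters(residuals, n_planes=4, reject_thresh=5.0):
    """Agglomerative clustering of points by residual profile; clusters whose
    mean residual exceeds the threshold are treated as wrong matches."""
    Z = linkage(residuals, method="average", metric="euclidean")
    labels = fcluster(Z, t=n_planes, criterion="maxclust")
    keep = np.ones(len(residuals), dtype=bool)
    for c in np.unique(labels):
        if residuals[labels == c].mean() > reject_thresh:
            keep[labels == c] = False                    # drop wrongly matched cluster
    return labels, keep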
Objective Image stitching is a technology for overcoming the limited field of view (FOV) of a single image by stitching multiple overlapping images into one wide-FOV image. Stitching images with parallax remains a challenging problem. Although numerous methods and commercial tools help users organize and appreciate photo collections, many of them fail to produce convincing results when the input images contain parallax. Among existing approaches, methods that partition the image into cells and apply local homography transformations are the most popular for parallax image stitching. However, many of these methods suffer from a high rate of wrongly matched feature points and low accuracy when aligning feature points across images taken from different viewpoints. We propose a novel stitching method that applies hierarchical agglomerative clustering to feature points using their plane similarity information to improve the precision of feature point matching.

Method First, we develop a feature point filtering algorithm based on clustering the feature points with planar information. The scale-invariant feature transform (SIFT) feature points of all images taken from different viewpoints are extracted. The k-nearest neighbors of each feature point are found with a k-d (k-dimensional) tree. K minimal sample sets are constructed, each containing four non-collinear feature points, to compute the homographies and the residual matrix. Second, the plane similarities of all feature points are computed from the residual matrix, and the feature points are divided into clusters by hierarchical agglomerative clustering. The feature points in each cluster share a common plane and the same homography transformation. If the mean residual of a cluster is larger than a threshold, the feature points in that cluster are labeled as wrong matches. Third, we propose an image stitching algorithm that partitions an image into cells with blend weights for multiplane images. All images are partitioned into equal-sized cells, and the local homography of each cell is computed via a linear transformation with blend weights. The weight of each feature point is computed from its plane information: if a feature point and the cell center have the same plane label, the weight is 1; otherwise, the weight is the Gaussian kernel of their radial distance. Lastly, the aligned images are rendered as a panorama using a multi-band blending method.
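The per-cell estimation described in the Method could be realized as a weighted direct linear transformation, sketched below. The weighting rule follows the abstract (weight 1 when the plane labels agree, otherwise a Gaussian of the distance to the cell center); the function names, the value of sigma, and the plain SVD solver are assumptions rather than the paper's exact formulation.

# Hedged sketch: weighted DLT for one grid cell with plane-aware blend weights.
import numpy as np

def cell_weights(points, labels, cell_center, cell_label, sigma=50.0):
    """Blend weight per feature point: 1 if the point shares the cell's plane
    label, otherwise a Gaussian kernel of its distance to the cell center."""
    d = np.linalg.norm(points - cell_center, axis=1)
    gauss = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
    return np.where(labels == cell_label, 1.0, gauss)

def weighted_dlt_homography(src, dst, weights):
    """Local homography minimizing the weighted algebraic error of the DLT
    system (src, dst: Nx2 matched points, weights: N blend weights)."""
    rows = []
    for (x, y), (u, v), w in zip(src, dst, weights):
        rows.append(w * np.array([-x, -y, -1, 0, 0, 0, u * x, u * y, u]))
        rows.append(w * np.array([0, 0, 0, -x, -y, -1, v * x, v * y, v]))
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)                  # smallest singular vector spans the solution
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

Each cell's homography would then warp the pixels of that cell, with the weights favoring correspondences that lie on the same plane as the cell; in practice, normalizing the coordinates before the DLT improves numerical conditioning.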
Result We compare our feature point filtering algorithm with the random sample consensus (RANSAC) algorithm on traditional building images and pavilion images. The RANSAC algorithm finds 427 and 541 matched feature points, respectively, whereas our algorithm finds 435 and 589. For the traditional building images, RANSAC produces six pairs of wrongly matched points while our algorithm produces only one; for the pavilion images, RANSAC produces up to 20 pairs of wrongly matched points while our algorithm again produces only one. On 20 further images of different scenes, the average number of wrongly filtered feature points is 30 for RANSAC and only 3 for our method, and the average number of wrongly matched point pairs is 8 for RANSAC and only 2 for our method. We also compare our image stitching method with three state-of-the-art methods, namely automatic stitching (AutoStitch), as projective as possible (APAP), and adaptive as-natural-as-possible (AANAP), on traditional and modern building images. AutoStitch shows an obvious seam line and ghosting because it relies on a single global homography. APAP and AANAP produce better results but still exhibit some ghosting. On the 20 different scene images, the peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) of our method are 8.7% and 9.6% higher, respectively, than those of the second-best approach.

Conclusion This study presents a novel method for high-precision parallax image stitching based on feature point clustering. The experimental results show that our method, which filters feature points with the plane-information clustering results and constructs local homography transformations from them, increases the number of matched feature points, reduces the numbers of wrongly filtered and wrongly matched feature points, and improves the precision of feature point alignment compared with the state-of-the-art image stitching approaches AutoStitch, APAP, and AANAP. The results also show that the proposed stitching algorithm, which partitions images into cells with blend weights for multiplane images, achieves better stitching performance than these approaches in terms of both pixel-level and structural indexes.
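The pixel-level comparison reported in the Result section (PSNR and SSIM between a stitched result and a reference) can be reproduced with standard routines; a minimal sketch is given below, assuming scikit-image is available and using placeholder file names.

# Hedged sketch: PSNR and SSIM between a stitched result and a reference panorama.
# The file names are placeholders; both images must have the same size.
import cv2
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

reference = cv2.imread("reference_panorama.png")   # hypothetical reference image
stitched = cv2.imread("stitched_result.png")       # hypothetical stitching output

psnr = peak_signal_noise_ratio(reference, stitched)
ssim = structural_similarity(reference, stitched, channel_axis=2)  # color images
print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.4f}")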
Authors
谢从华
张冰
高蕴梅
Xie Conghua; Zhang Bing; Gao Yunmei (School of Computer Science and Engineering, Changshu Institute of Technology, Changshu 215500, China; School of Computer Science and Technology, Soochow University, Suzhou 215006, China; Library, Changshu Institute of Technology, Changshu 215500, China)
Source
《中国图象图形学报》
CSCD
Peking University Core Journals
2020, No. 6, pp. 1180-1189 (10 pages)
Journal of Image and Graphics
Funding
National Natural Science Foundation of China (61772242, 61572239, 61402204)
National Undergraduate Practice, Innovation and Entrepreneurship Training Program (201910333028)
Jiangsu Provincial Undergraduate Innovation and Entrepreneurship Training Program (201810333050X, 201910333022Z)
Keywords
image stitching
image registration
hierarchical clustering
parallax image
local homography transformation
feature point matching