Abstract
To improve the correlation between objective and subjective evaluation of image quality, a stereoscopic image quality assessment method based on image content and binocular characteristics is proposed. First, the frequency-domain information of the reference and distorted images is weighted separately, and the region of interest is extracted and used as a weight in the fusion calculation, yielding an image-content-based quality score. Next, a Laplacian pyramid and a binocular weighting model are used to decompose and fuse the left and right viewpoint images layer by layer, and the cyclopean (composite) image is reconstructed, yielding a score based on binocular characteristics. Finally, the two scores are combined into an overall stereoscopic image quality score. Taking the LIVE 3D image database as the test sample, the correlation between the proposed method and subjective scores is analyzed. Under the same conditions, for five distortion types (Gaussian blur, fast fading, white noise, JPEG2000 compression, and JPEG compression), the Pearson linear correlation coefficient and the Spearman rank correlation coefficient are overall better than those of existing algorithms. The results show that the method's predictions of stereoscopic image quality are highly consistent with subjective evaluation values.
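The cyclopean-image step described above (Laplacian pyramid decomposition, per-level binocular weighting, and reconstruction) and the reported correlation measures can be illustrated with the minimal Python sketch below. This is not the authors' implementation: the local-energy-based fusion weights and the placeholder score arrays are assumptions for illustration only; the paper's actual binocular weighting model and data may differ.

```python
import cv2
import numpy as np
from scipy.stats import spearmanr, pearsonr

def laplacian_pyramid(img, levels=4):
    """Build a Laplacian pyramid; the last entry is the low-pass residual."""
    pyr, cur = [], img.astype(np.float32)
    for _ in range(levels):
        down = cv2.pyrDown(cur)
        up = cv2.pyrUp(down, dstsize=(cur.shape[1], cur.shape[0]))
        pyr.append(cur - up)          # band-pass (detail) layer
        cur = down
    pyr.append(cur)                   # low-pass residual
    return pyr

def fuse_binocular(left, right, levels=4, eps=1e-6):
    """Fuse left/right views level by level into a cyclopean (composite) image."""
    pl, pr = laplacian_pyramid(left, levels), laplacian_pyramid(right, levels)
    fused = []
    for ll, rr in zip(pl, pr):
        # Assumed weights: local band energy as a stand-in for binocular dominance.
        el = cv2.GaussianBlur(ll * ll, (7, 7), 0)
        er = cv2.GaussianBlur(rr * rr, (7, 7), 0)
        wl = el / (el + er + eps)
        fused.append(wl * ll + (1.0 - wl) * rr)
    # Collapse the pyramid: upsample and add back each detail layer.
    out = fused[-1]
    for layer in reversed(fused[:-1]):
        out = cv2.pyrUp(out, dstsize=(layer.shape[1], layer.shape[0])) + layer
    return out

# Correlation with subjective scores (e.g., DMOS from LIVE 3D), as reported above.
# objective_scores and dmos are hypothetical per-image values; the sign of the
# correlation depends on whether higher scores mean better or worse quality.
objective_scores = np.array([0.91, 0.74, 0.62, 0.55])
dmos = np.array([12.3, 28.7, 41.2, 50.9])
srocc, _ = spearmanr(objective_scores, dmos)
plcc, _ = pearsonr(objective_scores, dmos)
print(f"SROCC={srocc:.3f}, PLCC={plcc:.3f}")
```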
Authors
WANG Yang (王杨), XIANG Xiu-mei (向秀梅), LU Jia (卢嘉), YU Zhen-xin (郁振鑫)
College of Electronics and Information Engineering, Hebei University of Technology; Tianjin Key Laboratory of Electronic Materials & Devices, Tianjin 300401, China
Source
Science Technology and Engineering (《科学技术与工程》), Peking University Chinese Core Journal (北大核心)
2019, No. 33, pp. 314-318 (5 pages)
Funding
Youth Project of the Natural Science Foundation of Hebei Province (F2014202036)
Humanities and Social Sciences Research Project of the Ministry of Education (15YJA630108)
Keywords
stereoscopic image quality evaluation
region of interest
visual characteristics
Laplacian pyramid
cyclopean image