Abstract
With the development of internet technology, information carriers have become increasingly diverse, and images, as an important carrier, can effectively exchange and convey information. However, most current image quality evaluation scenarios involve blind images, for which the accuracy of image feature extraction and the degree of fit with the human visual system remain important research topics. To address the low accuracy caused by the restricted kernel function in the original blind-image feature extraction process, this paper obtains the predicted quality score of an image by introducing a deep residual regression network and image confidence, together with image sub-block division, a non-uniform step size, and confidence intervals based on brightness and contrast. The results show that the proposed blind image quality evaluation algorithm has good generalization performance, improving prediction scores for distorted images by more than 3% under different sub-block size divisions. Both the SROCC and PLCC values are above 0.9, and the RMSE values are lower than those of the other algorithms, indicating high consistency between subjective and objective quality evaluation. This method provides a new means of blind image quality prediction consistent with human visual observation and broadens the depth of such evaluation work.
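The abstract reports performance via SROCC, PLCC, and RMSE between predicted scores and subjective quality scores. As a minimal sketch of how these three metrics are conventionally computed (the function name `evaluate_iqa` and the NumPy-based rank computation are illustrative, not taken from the paper):

```python
import numpy as np

def _ranks(x):
    """Ordinal ranks (1-based); assumes no ties in the scores."""
    order = np.argsort(x)
    ranks = np.empty(len(x), dtype=float)
    ranks[order] = np.arange(1, len(x) + 1)
    return ranks

def evaluate_iqa(predicted, mos):
    """Return (SROCC, PLCC, RMSE) of predicted scores vs. subjective MOS."""
    p = np.asarray(predicted, dtype=float)
    m = np.asarray(mos, dtype=float)
    plcc = float(np.corrcoef(p, m)[0, 1])                   # linear correlation
    srocc = float(np.corrcoef(_ranks(p), _ranks(m))[0, 1])  # rank (monotonic) correlation
    rmse = float(np.sqrt(np.mean((p - m) ** 2)))            # prediction error
    return srocc, plcc, rmse
```

Values of SROCC and PLCC close to 1 together with a small RMSE correspond to the "high consistency between subjective and objective evaluation" claimed above.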
Authors
齐博
张国华
于立子
QI Bo; ZHANG Guohua; YU Lizi (Department of Big Data and Computer Science, Northeast Petroleum University Qinhuangdao, Qinhuangdao, Hebei 066004, China)
Source
《西南师范大学学报(自然科学版)》
CAS
2023, No. 7, pp. 21-30 (10 pages in total)
Journal of Southwest China Normal University(Natural Science Edition)
Funding
Natural Science Foundation of Heilongjiang Province (LH2019F042).