Abstract
A natural image segmentation approach is proposed that combines the total variation flow boundary with the M²GGD probability density distribution. Natural images are often corrupted by noise, which degrades the visual quality of region-based segmentation results, whereas the boundaries between regions remain effective at distinguishing non-homogeneous regions. The method therefore extracts boundaries with the total variation flow and integrates them with the M²GGD distribution to build a segmentation model with stronger spatial constraints. Because minimizing the resulting energy is NP-hard, an expectation-maximization style iterative optimization is designed: the region term and the edge term of the model are mapped to the t-links and n-links of a multilayer graph-cut model, respectively, and an approximate global optimum is obtained with the maximum-flow/minimum-cut algorithm. Experiments on noise-corrupted synthetic images and natural scene images show that the proposed approach is robust to noise, achieves high quantitative accuracy, and yields segmentations close to the ground truth.
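To make the graph construction described above concrete, the following Python sketch illustrates the general region-term/edge-term graph-cut idea under stated assumptions: it performs a binary (two-class) segmentation rather than the paper's multilayer formulation, a plain two-component Gaussian mixture stands in for the M²GGD density, a gradient-magnitude edge map stands in for the total variation flow boundary, and the third-party NumPy, SciPy, scikit-learn, and PyMaxflow packages are assumed. The function segment_binary and its parameters lam and beta are illustrative names, not the authors' code.

import numpy as np
import maxflow                                  # third-party PyMaxflow package (assumed dependency)
from scipy.stats import norm
from sklearn.mixture import GaussianMixture


def segment_binary(img, lam=1.0, beta=2.0):
    """Binary graph-cut segmentation of a 2-D grayscale image with values in [0, 1]."""
    # Region term: a two-component Gaussian mixture (stand-in for M²GGD);
    # the per-class negative log-likelihoods become the t-link capacities.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(img.reshape(-1, 1))
    mu = gmm.means_.ravel()
    sd = np.sqrt(gmm.covariances_.ravel())
    nll0 = -norm.logpdf(img, mu[0], sd[0])      # cost of labelling a pixel as class 0
    nll1 = -norm.logpdf(img, mu[1], sd[1])      # cost of labelling a pixel as class 1

    # Edge term: gradient magnitude as a crude edge map (stand-in for the total
    # variation flow boundary); n-link weights shrink across strong edges, so the
    # minimum cut prefers to pass along region boundaries.
    gy, gx = np.gradient(img)
    grad = np.hypot(gx, gy)
    w = lam * np.exp(-beta * grad / (grad.mean() + 1e-8))

    # Graph construction and max-flow/min-cut.
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)
    four_conn = np.array([[0, 1, 0],
                          [1, 0, 1],
                          [0, 1, 0]])
    g.add_grid_edges(nodes, weights=w, structure=four_conn, symmetric=True)
    g.add_grid_tedges(nodes, nll1, nll0)        # t-links carry the per-class data costs
    g.maxflow()
    return g.get_grid_segments(nodes)           # boolean mask separating the two classes


if __name__ == "__main__":
    # Toy example: a noisy bright square on a dark background.
    rng = np.random.default_rng(0)
    img = np.full((64, 64), 0.2)
    img[16:48, 16:48] = 0.8
    img += 0.1 * rng.standard_normal(img.shape)
    mask = segment_binary(np.clip(img, 0.0, 1.0))
    print("foreground pixels:", int(mask.sum()))

The toy example only checks that the pipeline runs; the paper's method additionally wraps the cut inside an expectation-maximization style loop that re-estimates the distribution parameters after each labelling, and uses a multilayer graph to handle more than two classes.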
Authors
YANG Yong (杨勇)
GUO Ling (郭玲)
YE Yangdong (叶阳东)
YANG Yong; GUO Ling; YE Yangdong (School of Information Engineering, Zhengzhou University of Industrial Technology, Xinzheng, Henan 451100, China; School of Information Engineering, Zhengzhou University, Zhengzhou 450000, China; Department of Archives Center, Zhengzhou University of Aeronautics, Zhengzhou 450046, China)
Source
《计算机工程与应用》
CSCD
Peking University Core Journals (北大核心)
2021, No. 11, pp. 202-210 (9 pages)
Computer Engineering and Applications
Funding
National Natural Science Foundation of China (61772475, 61502432)
Young Key Teachers Training Program of Colleges and Universities of the Education Department of Henan Province (2017GGJS186, 2019GGJS279)
Zhengzhou Key Laboratory of Intelligent Transportation Video Image Perception and Recognition Project (Zheng Ke [2020] No. 34).