Abstract
Retinal vessel segmentation plays an important role in the evaluation and diagnosis of many types of ophthalmic diseases. Because the topology of blood vessels in fundus images is complex and variable, existing algorithms often produce segmentation results with discontinuous vascular features and poorly delineated vessel edges. To address these problems, this paper proposes MSGA-UNet, a multi-scale global-attention U-shaped neural network for retinal vessel segmentation. On the one hand, a global feature attention module lets the network readily obtain global representation information from the encoder, alleviating the feature-discontinuity problem in fundus retinal vessel segmentation. On the other hand, a multi-scale dilated convolution module applies dilated convolutions with different dilation rates to enlarge the receptive field and capture multi-scale local feature information, thereby improving the extraction of vessel edge information. In experiments on the DRIVE, STARE, and CHASEDB1 datasets, MSGA-UNet achieved mean Intersection over Union (mIoU) scores of 74.06%, 78.22%, and 79.62%; class-wise mean pixel accuracies of 80.39%, 84.60%, and 85.53%; and accuracies of 96.32%, 96.42%, and 97.23%, respectively. Its overall segmentation performance is superior to that of the compared models.
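The multi-scale dilated convolution idea in the abstract rests on a simple property: a kernel of size k with dilation rate d spreads its taps d apart, so it covers d·(k−1)+1 input positions at the same parameter cost as an ordinary convolution. The sketch below illustrates this with a minimal 1-D NumPy implementation; the dilation rates (1, 2, 4) and the function names are illustrative assumptions, not the paper's actual MSGA-UNet configuration.

```python
import numpy as np

def dilated_rf(kernel_size: int, dilation: int) -> int:
    """Effective span of one dilated convolution: the taps sit
    `dilation` apart, covering dilation*(kernel_size-1)+1 inputs."""
    return dilation * (kernel_size - 1) + 1

def dilated_conv1d(x, w, dilation):
    """Valid-mode 1-D dilated convolution (cross-correlation), NumPy only.
    Each output i sums w[j] * x[i + j*dilation] over the kernel taps j."""
    k = len(w)
    span = dilated_rf(k, dilation)
    out_len = len(x) - span + 1
    return np.array([
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(out_len)
    ])

# Hypothetical dilation rates 1, 2, 4: a size-3 kernel then spans
# 3, 5, and 9 inputs, i.e. the receptive field grows with no extra weights.
spans = [dilated_rf(3, d) for d in (1, 2, 4)]
print(spans)  # [3, 5, 9]
```

A multi-scale module in this spirit would run such branches with different rates in parallel and fuse their outputs, so that fine vessel edges (small rate) and broader context (large rate) are captured simultaneously.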
Authors
MA Wenjing; WANG Xuejin; XING Shuli; MAO Guojun (Key Laboratory of Big Data Mining and Application in Fujian Province, Fujian University of Technology, Fuzhou, Fujian 350118)
Source
Software (《软件》), 2024, No. 1, pp. 21-24 and 37 (5 pages)
Funding
National Natural Science Foundation of China (61773415); National Key R&D Program of China (2019YFD0900905).