Abstract
Pulmonary nodule detection from chest CT images is a key means of early lung cancer screening, and false-positive reduction of candidate nodules is a critical step in automatic nodule detection. Traditional detection methods depend heavily on prior knowledge, involve cumbersome processing pipelines, and deliver unsatisfactory performance. In deep learning, convolutional neural networks can extract image features through a general learning process. Based on DenseNet, this paper designs a three-dimensional false-positive reduction model for candidate nodules, the 3D convolutional neural network TDN-CNN. First, U-Net is used to segment the lung parenchyma from the CT images, volumes of interest (VOIs) centered on the candidate nodules are then extracted, and the positive samples are augmented by translation and flipping. In the 3D false-positive reduction network, dense connections strengthen feature reuse and enlarge the feature space, bottleneck layers reduce parameter redundancy, and the parameters are tuned during training to obtain the optimal model. Compared with 2D CNNs, the model makes full use of the three-dimensional spatial features of pulmonary nodules. On the publicly available LIDC dataset, the 3D CNN achieves a CPM score of 0.840, significantly higher than that of several other 3D models. The experimental results demonstrate the effectiveness of the model and its suitability for false-positive reduction in pulmonary nodule detection systems.
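As a concrete illustration of the dense connections and bottleneck layers described above, the following is a minimal PyTorch sketch of a 3D dense block; the growth rate, layer count, and the 32x32x32 VOI size are assumptions for illustration, not the authors' exact TDN-CNN configuration.

    import torch
    import torch.nn as nn

    class Bottleneck3D(nn.Module):
        """1x1x1 bottleneck conv followed by a 3x3x3 conv; the output is
        concatenated with the input to form a dense connection."""
        def __init__(self, in_channels, growth_rate):
            super().__init__()
            inter = 4 * growth_rate  # bottleneck width, a common DenseNet choice
            self.layers = nn.Sequential(
                nn.BatchNorm3d(in_channels), nn.ReLU(inplace=True),
                nn.Conv3d(in_channels, inter, kernel_size=1, bias=False),
                nn.BatchNorm3d(inter), nn.ReLU(inplace=True),
                nn.Conv3d(inter, growth_rate, kernel_size=3, padding=1, bias=False),
            )

        def forward(self, x):
            # Feature reuse: earlier feature maps are passed on unchanged.
            return torch.cat([x, self.layers(x)], dim=1)

    class DenseBlock3D(nn.Sequential):
        def __init__(self, num_layers, in_channels, growth_rate):
            super().__init__(*[
                Bottleneck3D(in_channels + i * growth_rate, growth_rate)
                for i in range(num_layers)
            ])

    # Example: one dense block applied to a candidate-nodule VOI.
    voi = torch.randn(1, 1, 32, 32, 32)                # (batch, channel, D, H, W)
    stem = nn.Conv3d(1, 16, kernel_size=3, padding=1)  # initial convolution
    block = DenseBlock3D(num_layers=4, in_channels=16, growth_rate=12)
    out = block(stem(voi))                             # shape: (1, 16 + 4*12, 32, 32, 32)

A full false-positive classifier would stack several such blocks, separated by transition (pooling) layers, and end in global pooling followed by a binary classification head.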
Authors
杨靖祎
谢洋
周晓叶
陈隆鑫
底涛
YANG Jing-yi; XIE Yang; ZHOU Xiao-ye; CHEN Long-xin; DI Tao (Data Center, Second Hospital of Hebei Medical University, Shijiazhuang 050051, China; Information Center, Second Hospital of Hebei Medical University, Shijiazhuang 050051, China)
Source
《计算机技术与发展》
2022, No. 2, pp. 196-201, 206 (7 pages)
Computer Technology and Development
Funding
National Natural Science Foundation of China (61702347).