Although the use of convolutional neural networks (CNNs) for computer-aided diagnosis (CAD) has made tremendous progress in recent years, small medical datasets remain the major bottleneck in this area. To address this problem, researchers have started looking for information beyond the medical datasets themselves. Previous efforts mainly leverage information from natural images via transfer learning. More recent work focuses on integrating knowledge from medical practitioners, either by letting networks resemble how practitioners are trained or how they view images, or by using extra annotations. In this paper, we propose a scheme named Domain Guided-CNN (DG-CNN) to incorporate margin information, a feature described in the radiologists' consensus for diagnosing cancer in breast ultrasound (BUS) images. In DG-CNN, attention maps that highlight the margin areas of tumors are first generated and then incorporated into the networks via different approaches. We have tested the performance of DG-CNN on our own dataset (1485 ultrasound images) and on a public dataset. The results show that DG-CNN can be applied to different network structures, such as VGG and ResNet, to improve their performance. For example, experimental results on our dataset show that, with a certain integration mode, DG-CNN improves over the baseline ResNet-18 by 2.17% in accuracy, 1.69% in sensitivity, 2.64% in specificity, and 2.57% in AUC (area under the curve). To the best of our knowledge, this is the first time that margin information has been used to improve the performance of deep neural networks in diagnosing breast cancer in BUS images.
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 61976012 and 61772060, the National Key Research and Development Program of China under Grant No. 2017YFB1301100, and the China Education and Research Network Innovation Project under Grant No. NGII20170315.
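To illustrate the kind of attention-map integration the abstract describes, the following is a minimal sketch, assuming a PyTorch-style implementation in which a margin attention map re-weights the input image before a standard ResNet-18 backbone. The class and parameter names are assumptions introduced for illustration only; the paper reports several integration modes, and its actual implementation may differ.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class MarginGuidedResNet(nn.Module):
    """Hypothetical sketch of one input-level integration mode:
    the margin attention map emphasizes tumor-margin pixels before
    the image is passed to a ResNet-18 classifier."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.backbone = resnet18(weights=None)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)

    def forward(self, image: torch.Tensor, margin_map: torch.Tensor) -> torch.Tensor:
        # image:      (N, 3, H, W) BUS image (grayscale replicated to 3 channels)
        # margin_map: (N, 1, H, W) attention map highlighting the tumor margin
        guided = image * (1.0 + margin_map)  # element-wise emphasis of margin regions
        return self.backbone(guided)

# Usage sketch with random tensors standing in for a BUS image and its margin map.
model = MarginGuidedResNet()
x = torch.randn(4, 3, 224, 224)
m = torch.rand(4, 1, 224, 224)
logits = model(x, m)  # (4, 2) benign/malignant scores
```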