
乳腺癌的多模态显微谱像分析及智能融合诊断研究

Multimodal Microscopic Spectrum-Image Analysis and Intelligent Fusion Diagnosis of Breast Cancer
Abstract: Microscopic imaging and fluorescence spectroscopy are important tools for studying the characteristics of breast cancer tissue, as they effectively reflect changes in tissue morphology and biochemical composition. Using a self-designed and custom-built inverted fluorescence microscope, our group simultaneously collected microscopic images and fluorescence spectra from patients' breast tissue and then performed a combined analysis and fusion of the bright-field images, multiwavelength fluorescence microscopic images, and fluorescence spectra. For the spectral analysis, a Gaussian fitting model was used to decompose the fluorescence spectra into sub-peaks in order to explore how the fluorescent components change during breast cancer progression. The results show that the ratios of the fluorescence band peak areas at the 520 nm and 635 nm central wavelengths to that at 470 nm (A520/A470 and A635/A470) in cancerous breast tissue are about 1.65 and 2.07 times those of normal breast tissue, respectively. We therefore propose the peak-area ratios A520/A470 and A635/A470 as potential indicators for breast cancer detection. In addition, a spectrum-image fusion neural network based on microscopic images and fluorescence spectra was constructed and used for intelligent breast cancer diagnosis, achieving an AUC (area under the receiver operating characteristic curve) of 0.95 and a test-set accuracy of 86.38%, clearly outperforming each single-modality model. These results indicate that dual-modal microscopic imaging combined with fluorescence spectroscopy offers unique advantages for breast cancer analysis and diagnosis, and that a spectrum-image fusion network built with deep learning provides an effective route to intelligent breast cancer diagnosis.

Objective Breast cancer is among the most common malignant tumors and a serious threat to women's health, so rapid and efficient screening for breast cancer is increasingly important. Currently, imaging and pathological examinations are the two main methods used for breast cancer diagnosis. Imaging examinations typically suffer from long examination times and radiation exposure, while pathological examination, the gold standard for cancer diagnosis, involves complicated and time-consuming sample preparation. A new intelligent method is therefore needed to reduce reliance on these traditional techniques. Microscopic imaging and fluorescence spectroscopy are crucial tools for studying the characteristics of cancerous breast tissue and for effectively capturing changes in tissue morphology and biochemical composition. In this study, a self-designed and custom-built inverted fluorescence microscope was employed to simultaneously collect microscopic images and fluorescence spectra from patients' breast tissue.

Methods The samples used in this study were obtained from patients with invasive breast cancer. Fresh tissue samples were cut into blocks of approximately 3 mm × 3 mm × 2 mm, quickly frozen in liquid nitrogen, and then sectioned at a thickness of 15 μm; no staining of the tissue sections was required. Microscopic images and spectra were then collected with equipment designed and customized around an inverted fluorescence microscope (IX51, Olympus). Bright-field and fluorescence imaging modes at different wavelengths were obtained by switching excitation filters and dichroic mirrors and adjusting the light source. In total, 69 sets of multimodal microscopic images and 46 sets of spectral data (divided into purple- and blue-light excitation) were obtained from tissue sections of 23 patients. The spectral data were preprocessed with baseline correction and third-order polynomial, 30-point Savitzky-Golay smoothing. Finally, a neural network model was constructed from the multimodal microscopic images and fluorescence spectra, comprising image feature extraction, spectral feature extraction, and spectrum-image feature fusion. For image feature extraction, the multimodal images were stacked into a three-dimensional (3D) matrix and joint features were extracted with 3D convolutional layers and residual modules. For spectral feature extraction, the fluorescence spectra excited at different wavelengths (purple and blue) were stacked in parallel into a two-dimensional (2D) matrix and joint features were extracted with 2D convolutional layers and residual modules. The extracted image and spectral features were then concatenated, and spectrum-image fusion features were further learned by a fully connected network, ultimately yielding an intelligent diagnosis of breast cancer.
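The Methods specify baseline correction followed by third-order, 30-point Savitzky-Golay smoothing but do not name the baseline algorithm. The Python sketch below is only an illustration of that preprocessing step: it assumes a simple linear endpoint baseline and uses SciPy's savgol_filter with a 31-point (odd) window as the nearest conventional substitute for the stated 30-point window.

```python
"""
Minimal sketch of the spectral preprocessing described in the Methods:
baseline correction followed by third-order Savitzky-Golay smoothing.
The baseline algorithm is not specified in the abstract, so a simple
linear endpoint baseline is assumed; a 31-point (odd) window stands in
for the stated 30-point window.
"""
import numpy as np
from scipy.signal import savgol_filter


def preprocess_spectrum(wavelengths: np.ndarray, intensities: np.ndarray) -> np.ndarray:
    """Baseline-correct and smooth one fluorescence spectrum."""
    # Assumed baseline: straight line through the first and last points.
    baseline = np.interp(
        wavelengths,
        [wavelengths[0], wavelengths[-1]],
        [intensities[0], intensities[-1]],
    )
    corrected = intensities - baseline

    # Savitzky-Golay smoothing: 3rd-order polynomial, ~30-point window.
    return savgol_filter(corrected, window_length=31, polyorder=3)


if __name__ == "__main__":
    # Synthetic spectrum (not real data): two emission bands on a sloped baseline.
    wl = np.linspace(430, 700, 540)
    spec = (
        1.0 * np.exp(-0.5 * ((wl - 470) / 25) ** 2)
        + 0.6 * np.exp(-0.5 * ((wl - 520) / 30) ** 2)
        + 0.002 * (wl - 430)
        + 0.02 * np.random.default_rng(0).normal(size=wl.size)
    )
    print(preprocess_spectrum(wl, spec)[:5])
```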
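The Methods also outline a two-branch network: 3D convolution with residual modules for the stacked multimodal images, 2D convolution with residual modules for the two stacked excitation spectra, and a fully connected head on the concatenated features. The PyTorch sketch below reproduces only that structure; the layer counts, channel widths, and input shapes are assumptions, not the authors' actual architecture.

```python
"""
Structural sketch (not the authors' exact architecture) of the
spectrum-image fusion network: a 3D-convolutional residual branch for
the stacked multimodal images, a 2D-convolutional residual branch for
the two stacked excitation spectra, and a fully connected fusion head.
"""
import torch
import torch.nn as nn


class ResBlock3D(nn.Module):
    """Basic residual block for the 3D image branch."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm3d(channels), nn.ReLU(inplace=True),
            nn.Conv3d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm3d(channels),
        )

    def forward(self, x):
        return torch.relu(self.body(x) + x)


class ResBlock2D(nn.Module):
    """Basic residual block for the 2D spectral branch."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return torch.relu(self.body(x) + x)


class SpectrumImageFusionNet(nn.Module):
    """Two-branch spectrum-image fusion classifier (normal vs. cancerous)."""
    def __init__(self, img_channels: int = 1, spec_channels: int = 1):
        super().__init__()
        # Image branch: multimodal images stacked along a depth axis,
        # input shape (batch, channels, depth = number of modalities, H, W).
        self.image_branch = nn.Sequential(
            nn.Conv3d(img_channels, 16, kernel_size=3, padding=1),
            nn.BatchNorm3d(16), nn.ReLU(inplace=True),
            ResBlock3D(16),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        # Spectral branch: two excitation spectra stacked into a 2 x N matrix,
        # input shape (batch, channels, 2, N).
        self.spectrum_branch = nn.Sequential(
            nn.Conv2d(spec_channels, 16, kernel_size=3, padding=1),
            nn.BatchNorm2d(16), nn.ReLU(inplace=True),
            ResBlock2D(16),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fully connected head on the concatenated spectrum-image features.
        self.head = nn.Sequential(
            nn.Linear(16 + 16, 32), nn.ReLU(inplace=True),
            nn.Linear(32, 2),
        )

    def forward(self, images, spectra):
        fused = torch.cat(
            [self.image_branch(images), self.spectrum_branch(spectra)], dim=1
        )
        return self.head(fused)


if __name__ == "__main__":
    # Illustrative shapes only: 5 stacked image modalities of 64 x 64 pixels
    # and two 512-point excitation spectra per sample.
    net = SpectrumImageFusionNet()
    logits = net(torch.randn(4, 1, 5, 64, 64), torch.randn(4, 1, 2, 512))
    print(logits.shape)  # torch.Size([4, 2])
```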
Results and Discussions The first step is multimodal microscopic image analysis. Under purple-light excitation, the bright-field images, blue-fluorescence images, and their fused images show that the extracellular matrix of normal breast tissue has a more uniform fluorescence distribution than that of cancerous breast tissue (Fig. 3). Under blue-light excitation, the bright-field images, green-fluorescence images, and their fused images show that cancerous breast tissue exhibits a strong green fluorescence signal (Fig. 4). Fluorescence spectrum analysis shows that the average spectral intensity of normal breast tissue under purple-light excitation is significantly stronger than that of cancerous breast tissue, whereas the fluorescence intensity of cancerous breast tissue under blue-light excitation is slightly higher than that of normal tissue (Fig. 5). A Gaussian function was used to fit the spectra excited by purple light. The results show that the ratios of the fluorescence peak areas at the 520 nm and 635 nm central wavelengths to that at 470 nm (A520/A470 and A635/A470) in cancerous breast tissue increase by 0.65 and 1.07 times, respectively, compared with those of normal breast tissue (Table 2). Finally, in the intelligent diagnostic analysis, the training and prediction results of the spectrum-image fusion neural network show that the risk of overfitting is extremely low. The calculated area under the curve (AUC) is 0.95, indicating good classification performance, and the average accuracy on the test set is 86.38% (Fig. 7). For comparison, the training results of the five types of image and spectral data used individually for breast cancer diagnosis are also listed; the spectrum-image fusion neural network achieves a significantly higher prediction accuracy than each single-modality model (Table 3).

Conclusions In this study, dual-modal microscopic imaging and multiwavelength microfluorescence spectroscopy are combined to diagnose breast cancer and to obtain more comprehensive information on compositional changes in breast tissue during canceration. A Gaussian fitting model is used to decompose the spectral data into sub-peaks, changes in endogenous fluorophore content during cancerization are discussed, and the peak-area ratio is proposed as a potential criterion for diagnosing breast cancer. In addition, deep learning is introduced to construct a spectrum-image fusion neural network model based on fluorescence microscopic images and microscopic spectra, achieving an AUC of 0.95 and an accuracy of 86.38%, both significantly higher than those of each single-modality model. This provides a feasible, convenient, fast, and clinically meaningful method for the intelligent diagnosis of breast cancer.
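The peak-area ratios A520/A470 and A635/A470 reported above come from Gaussian peak decomposition of the purple-excited spectra. The sketch below illustrates one way such ratios can be computed, assuming a three-Gaussian model with bands centred near 470, 520, and 635 nm and using scipy.optimize.curve_fit; the authors' actual fitting procedure and constraints are not given in the abstract.

```python
"""
Minimal sketch of Gaussian peak decomposition and the peak-area ratios
A520/A470 and A635/A470. A three-Gaussian model with bands centred near
470, 520 and 635 nm is assumed.
"""
import numpy as np
from scipy.optimize import curve_fit


def three_gaussians(wl, a1, c1, s1, a2, c2, s2, a3, c3, s3):
    """Sum of three Gaussian bands."""
    g = lambda a, c, s: a * np.exp(-0.5 * ((wl - c) / s) ** 2)
    return g(a1, c1, s1) + g(a2, c2, s2) + g(a3, c3, s3)


def peak_area_ratios(wavelengths, spectrum):
    """Fit three Gaussian bands and return (A520/A470, A635/A470)."""
    # Initial guesses: centres near 470, 520, 635 nm with moderate widths.
    p0 = [spectrum.max(), 470, 20,
          spectrum.max() / 2, 520, 25,
          spectrum.max() / 4, 635, 25]
    popt, _ = curve_fit(three_gaussians, wavelengths, spectrum, p0=p0, maxfev=20000)
    # Analytical area of a Gaussian band: amplitude * sigma * sqrt(2*pi).
    a470, a520, a635 = [abs(popt[i]) * abs(popt[i + 2]) * np.sqrt(2 * np.pi)
                        for i in (0, 3, 6)]
    return a520 / a470, a635 / a470


if __name__ == "__main__":
    # Synthetic spectrum (not real data) just to exercise the fit.
    wl = np.linspace(430, 700, 540)
    spec = three_gaussians(wl, 1.0, 470, 22, 0.5, 520, 28, 0.3, 635, 24)
    print(peak_area_ratios(wl, spec))  # ratios of fitted band areas
```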
Authors: 吴青霞 (Wu Qingxia), 李柏楠 (Li Bainan), 惠紫阳 (Hui Ziyang), 王子函 (Wang Zihan), 李运宏 (Li Yunhong), 尚林伟 (Shang Linwei), 尹建华 (Yin Jianhua); Department of Biomedical Engineering, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, Jiangsu, China
Source: Chinese Journal of Lasers (《中国激光》), 2024, Issue 15, pp. 143-151 (9 pages). Indexed in EI, CAS, CSCD, and the Peking University Core Journals list (北大核心).
Funding: National Natural Science Foundation of China (62375127, 62105147); Key Research and Development Program of Jiangsu Province (BE2023812); Special Fund for Forward-Looking Layout of Nanjing University of Aeronautics and Astronautics (ILA-22022).
Keywords: dual-modal microscopic imaging; multiwavelength fluorescence spectroscopy; spectrum-image fusion; deep learning; intelligent diagnosis