Abstract
The accuracy of image-based fatigue crack detection is strongly affected by complex background factors such as welds and coatings. To address this problem, a deep-learning-based atrous pyramid attention network (APA-Net) was proposed for fatigue crack segmentation. Building on the traditional encoder-decoder architecture, the model introduces a pre-trained ResNet34 backbone, a dense atrous convolution (DAC) module, a scale-aware pyramid fusion (SAPF) module, and an attention gating (AG) mechanism, which greatly improve its ability to extract multi-scale contextual information. A fatigue crack segmentation dataset for steel box girders containing various background disturbances was built by image cropping, and it was used to test APA-Net against classical networks, including the fully convolutional network (FCN), U-Net, Attention U-Net, U-Net++, and the context encoder network (CE-Net). The results show that the proposed APA-Net performs best at extracting fatigue cracks from steel box girder surface images captured under complex background interference, achieving a mean intersection-over-union of 72.2%, about 4% higher than that of CE-Net, the best performer among the other classical networks. Finally, the effect of each proposed module on crack segmentation accuracy was examined through ablation experiments.
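The abstract names two architectural ideas, dense atrous convolution for multi-scale context and attention gating on skip connections, plus the mean IoU metric used for evaluation. Below is a minimal, hedged PyTorch sketch of these ideas; all module names, channel sizes, and dilation rates are hypothetical illustrations, not the authors' released implementation of APA-Net.

```python
# Illustrative sketch only: parallel dilated convolutions (DAC-style multi-scale
# context), an attention gate on a skip connection, and a binary mean IoU metric.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseAtrousBlock(nn.Module):
    """Fuse parallel dilated convolutions to enlarge the receptive field."""

    def __init__(self, channels, dilations=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(channels, channels, 3, padding=d, dilation=d)
             for d in dilations]
        )
        self.fuse = nn.Conv2d(channels * len(dilations), channels, 1)

    def forward(self, x):
        feats = [F.relu(branch(x)) for branch in self.branches]
        return F.relu(self.fuse(torch.cat(feats, dim=1)))


class AttentionGate(nn.Module):
    """Re-weight encoder skip features using a gating signal from the decoder."""

    def __init__(self, skip_ch, gate_ch, inter_ch):
        super().__init__()
        self.theta = nn.Conv2d(skip_ch, inter_ch, 1)
        self.phi = nn.Conv2d(gate_ch, inter_ch, 1)
        self.psi = nn.Conv2d(inter_ch, 1, 1)

    def forward(self, skip, gate):
        gate = F.interpolate(gate, size=skip.shape[2:], mode="bilinear",
                             align_corners=False)
        attn = torch.sigmoid(self.psi(F.relu(self.theta(skip) + self.phi(gate))))
        return skip * attn  # attenuate background (e.g. weld, coating) responses


def mean_iou(pred_mask, true_mask, eps=1e-6):
    """Mean IoU over the background and crack classes of binary masks."""
    ious = []
    for cls in (0, 1):  # 0 = background, 1 = crack
        p, t = (pred_mask == cls), (true_mask == cls)
        inter = (p & t).sum().float()
        union = (p | t).sum().float()
        ious.append(((inter + eps) / (union + eps)).item())
    return sum(ious) / len(ious)
```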
Authors
DENG Lu, XIANG Chao, WANG Wei, CAO Ran
(College of Civil Engineering, Hunan University, Changsha 410082, China; Hunan Provincial Key Laboratory for Damage Diagnosis of Engineering Structures, Hunan University, Changsha 410082, China)
Source
Journal of Huazhong University of Science and Technology (Natural Science Edition), 2022, No. 8, pp. 66-72 (7 pages)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
Funding
Supported by the National Natural Science Foundation of China (Grant Nos. 51778222, 51808209, 52108136).