Abstract
To address the multicolor interference and high complexity that characterize defects in spray-printed variable color 2D codes, as well as the insufficient accuracy and low efficiency of the detection methods currently used by printing enterprises, this paper proposes a defect classification model that integrates ResNet34 with a Transformer structure (ResNet34-TE). First, a color 2D code defect dataset is constructed, and a contour-shape detection method is introduced to extract the target region and suppress background interference. ResNet34 then serves as the backbone network for feature extraction; the average pooling layer is removed, and a Transformer encoder layer captures the global information of the extracted features, increasing attention to the region of interest. Experimental results show that ResNet34-TE reaches an accuracy of 96.80% with an average detection time of 15.59 ms per image, improving accuracy by 5.3 percentage points and detection speed by 5.8% over the baseline model and outperforming classical models in overall performance. On the public defect detection dataset NEU-DET, the proposed model achieves an accuracy of 98.86%, surpassing mainstream defect classification algorithms. The proposed model therefore delivers strong classification performance in defect recognition.
Authors
Li Ying; Dong Yao; He Zifen; Yuan Hao; Sun Fuyang; Gong Lingxi (Faculty of Mechanical and Electrical Engineering, Kunming University of Science and Technology, Kunming 650500, Yunnan, China)
Source
Laser & Optoelectronics Progress
CSCD
Peking University Core Journal
2024, Issue 18, pp. 98-109 (12 pages)
Funding
National Natural Science Foundation of China (62171206).