Abstract
To address the problems that fabric defects cover small areas and span a wide range of aspect ratios, an improved fabric defect detection algorithm based on Faster R-CNN is proposed. Taking Faster R-CNN as the base detector, an optimized ResNet50 is adopted as the backbone network: the width of the residual structure is increased while the depth of ResNet50 is kept unchanged, and part of the layer structure is adjusted and the network parameters are optimized, so that the network extracts more feature information with less computation. To address the low detection accuracy for fabric defects, a feature pyramid network (FPN) is introduced into Faster R-CNN for multi-scale prediction, and the anchor boxes generated by an improved K-means clustering algorithm replace the hand-designed anchor boxes of the original Faster R-CNN, strengthening the network's ability to focus on "small-target" defects and further improving detection accuracy. Experimental results show that, compared with the original Faster R-CNN, the improved Faster R-CNN raises the average precision by 6.6%, and the recognition rates for "small-target" and "slender" defects reach 95% and 97%, respectively, demonstrating good application value in fabric defect detection.
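The abstract does not detail the improved K-means step, so the sketch below illustrates the standard technique it builds on (popularized by YOLOv2): clustering ground-truth box width-height pairs under a 1 − IoU distance, so that tiny and elongated defects get anchors matched to their shapes. The function names, the deterministic area-quantile initialization, and the toy box sizes are all assumptions for this illustration, not the authors' exact algorithm.

```python
import numpy as np

def iou_wh(boxes, centers):
    """Pairwise IoU between (N, 2) width-height pairs and (K, 2) cluster centers,
    treating every box as if it shared the same top-left corner."""
    inter_w = np.minimum(boxes[:, None, 0], centers[None, :, 0])
    inter_h = np.minimum(boxes[:, None, 1], centers[None, :, 1])
    inter = inter_w * inter_h
    areas = boxes[:, 0] * boxes[:, 1]
    center_areas = centers[:, 0] * centers[:, 1]
    union = areas[:, None] + center_areas[None, :] - inter
    return inter / union

def kmeans_anchors(boxes, k, iters=100):
    """Cluster (width, height) pairs with 1 - IoU as the distance; the returned
    centers serve as anchor sizes. Deterministic area-quantile init (an assumption)."""
    order = np.argsort(boxes[:, 0] * boxes[:, 1])
    centers = boxes[order[np.linspace(0, len(boxes) - 1, k).astype(int)]]
    for _ in range(iters):
        # Assign each box to the center it overlaps most (highest IoU).
        assign = np.argmax(iou_wh(boxes, centers), axis=1)
        new = np.array([boxes[assign == j].mean(axis=0) if np.any(assign == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers

# Toy example: three small square defects and three slender wide defects.
boxes = np.array([[10, 10], [12, 11], [11, 12],
                  [100, 20], [98, 22], [102, 18]], dtype=float)
anchors = kmeans_anchors(boxes, k=2)
print(anchors)  # one small squarish anchor, one wide slender anchor
```

Under the 1 − IoU distance, a 10×10 box and a 100×20 box are nearly maximally far apart even though plain Euclidean distance on (w, h) would also separate them; the IoU metric matters more when boxes of similar area but very different aspect ratio must land in different clusters.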
Authors
SUN Xuan, GAO Xiaolin, CAO Gaoshuai (College of Mechanical and Control Engineering, Guilin University of Technology, Guilin, Guangxi 541004, China; Department of Electrical Engineering, Zhengzhou Electric Power Vocational and Technical College, Zhengzhou, Henan 451450, China)
Source
Wool Textile Journal (《毛纺科技》), indexed in CAS and the Peking University Core Journal list (北大核心); 2022, No. 12, pp. 77-84 (8 pages)
Funding
National Natural Science Foundation of China (52065016).