Abstract
Aiming at the problem that insulator defects in complex environments and hazy weather are too small for traditional object detection algorithms to identify, resulting in false and missed detections, an improved defect detection algorithm based on the YOLOv7 model is proposed. In the image preprocessing stage, a dark channel prior dehazing algorithm is applied to improve the distinguishability of features and the robustness of the model. To strengthen feature extraction and small-target recognition, a Dual Multi-Scale Attention Network (DMSANet) mechanism is introduced at the back end of the backbone network. To reduce the model size and increase recognition speed, an improved C3 module based on Swin Transformer replaces the E-ELAN module. In the prediction part, the Wise-IoU loss function is used to improve the convergence efficiency of the model. Experimental results show that, compared with the original YOLOv7 algorithm, the DMSANet-YOLOv7 algorithm improves mAP, precision, and recall by 6.3%, 7.9%, and 12.3%, respectively; the detection time per image reaches 12.3 ms, and the parameter count is 37.7 M. While improving detection accuracy, the algorithm maintains a balance between detection speed and performance, so it can be better deployed on UAVs and other platforms to meet the real-time dynamic detection requirements of insulators and their defects.
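The dark channel prior dehazing step described in the abstract can be sketched roughly as follows. This is a minimal illustration of the classic algorithm, not the authors' implementation: the patch size, the `omega` haze-retention factor, the transmission floor `t0`, and the fraction of bright pixels used to estimate atmospheric light are all assumed values, and the guided-filter refinement of the transmission map used in practice is omitted.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    """Per-pixel minimum over RGB channels, then a local minimum filter."""
    return minimum_filter(img.min(axis=2), size=patch)

def dehaze(img, omega=0.95, t0=0.1, patch=15, top_frac=0.001):
    """Dark channel prior dehazing on a float image in [0, 1], shape (H, W, 3)."""
    dark = dark_channel(img, patch)
    # Atmospheric light A: mean color of the brightest pixels in the dark channel.
    n = max(1, int(top_frac * dark.size))
    idx = np.argsort(dark.ravel())[-n:]
    A = img.reshape(-1, 3)[idx].mean(axis=0)
    # Transmission estimate: t(x) = 1 - omega * dark_channel(I / A).
    t = 1.0 - omega * dark_channel(img / A, patch)
    t = np.clip(t, t0, 1.0)  # floor avoids division blow-up in dense haze
    # Scene radiance recovery: J = (I - A) / t + A.
    return np.clip((img - A) / t[..., None] + A, 0.0, 1.0)
```

In a detection pipeline such as the one in the abstract, `dehaze` would run on each aerial image before it is fed to the detector, so that insulator features obscured by haze become more distinguishable.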
Authors
WANG Haiqun; WANG Kang (College of Electrical Engineering, North China University of Science and Technology, Tangshan 063210, China)
Source
Radio Engineering (《无线电工程》), 2024, No. 6, pp. 1431-1439 (9 pages)
Funding
Natural Science Foundation of Hebei Province (F2021209006).