Abstract
To reduce the impact of fog on transmission line inspection images, and to address the high computational cost, poor post-dehazing detection performance, and deployment difficulty of current mainstream dehazing algorithms, a dehazing method for foggy transmission line inspection images, Diff-EaT, is proposed. The method adopts a diffusion model structure fused with a Transformer. To reduce the computational complexity of multi-head self-attention in the ViT-based feature extraction module, multi-head external attention is used in place of multi-head self-attention, cutting the computational load while enhancing feature learning. In addition, a mixed-scale gated feed-forward network is designed, which integrates a gating mechanism after the depthwise separable convolution of the input features to improve local information capture. Comparative experiments on synthetic and real datasets confirm the method's effectiveness on both qualitative and quantitative metrics, with clearer detail in the restored images. In the dehazing-detection system, real inspection images are dehazed and then detected with YOLOv7; mAP@0.5, recall, and precision improve by 6.92%, 9.58%, and 4.11%, respectively, showing that the proposed method effectively raises detection confidence after dehazing and that the system can be applied in real-world scenarios. Ablation experiments further verify the effectiveness of the improvements.
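As context for the attention replacement described in the abstract, below is a minimal PyTorch sketch of multi-head external attention in the style of Guo et al.'s external attention, which swaps the O(N²) token-to-token attention for two small learnable memories and costs O(N·s). The memory size s, head count, query projection, and double-normalization details are assumptions, since the abstract does not specify the paper's exact module.

```python
# Minimal sketch of multi-head external attention (MEA); hyperparameters
# such as the memory size `s` are assumptions, not the paper's values.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadExternalAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, s: int = 64):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        head_dim = dim // num_heads
        self.q = nn.Linear(dim, dim)                        # token projection
        self.mk = nn.Linear(head_dim, s, bias=False)        # external key memory
        self.mv = nn.Linear(s, head_dim, bias=False)        # external value memory
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, N, C); complexity is O(N * s) instead of O(N^2)
        B, N, C = x.shape
        h = self.num_heads
        q = self.q(x).view(B, N, h, C // h).transpose(1, 2)    # (B, h, N, d)
        attn = self.mk(q)                                      # (B, h, N, s)
        attn = F.softmax(attn, dim=2)                          # softmax over tokens
        attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-6)  # l1-norm over memory
        out = self.mv(attn).transpose(1, 2).reshape(B, N, C)
        return self.proj(out)
```

Because the key and value memories are shared across all tokens, the per-token cost no longer grows with sequence length, which is the source of the computational saving the abstract claims over standard multi-head self-attention.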
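The mixed-scale gated feed-forward network is described only at a high level: a gating mechanism applied after depthwise separable convolution of the input features. The following is a hedged sketch of one plausible realization, assuming a Restormer-style expand-then-gate layout with 3×3 and 5×5 depthwise kernels; the expansion ratio, kernel sizes, and gate activation are all assumptions.

```python
# Plausible sketch of a mixed-scale gated feed-forward network (not the
# paper's exact design); kernel sizes and expansion ratio are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedScaleGatedFFN(nn.Module):
    def __init__(self, dim: int, expansion: int = 2):
        super().__init__()
        hidden = dim * expansion
        self.proj_in = nn.Conv2d(dim, hidden * 2, kernel_size=1)
        # depthwise branches at two receptive fields for mixed-scale context
        self.dw3 = nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden)
        self.dw5 = nn.Conv2d(hidden, hidden, 5, padding=2, groups=hidden)
        self.dw_gate = nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden)
        self.proj_out = nn.Conv2d(hidden, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) feature map
        v, g = self.proj_in(x).chunk(2, dim=1)   # value and gate branches
        v = self.dw3(v) + self.dw5(v)            # mixed-scale local context
        g = F.gelu(self.dw_gate(g))              # gate applied after depthwise conv
        return self.proj_out(v * g)              # element-wise gated aggregation
```

Here the pointwise output projection completes the depthwise separable pattern, and the element-wise gate lets the network suppress or pass local features, which matches the abstract's stated goal of improving local information capture.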
Authors
Zhou Jing, Tian Zhaoxing, Wang Manyi (School of Control and Computer Engineering, North China Electric Power University, Beijing 102206, China)
Source
《电子测量技术》 (Electronic Measurement Technology)
Peking University Core Journal (北大核心)
2024, No. 15, pp. 144-152 (9 pages)
Funding
Supported by the National Natural Science Foundation of China (Grant No. 52179014).