Abstract
Visual object detection and ranging uses computer vision to detect target objects in images or video and to measure their distance, and it plays an important role in industrial automation. Building on the YOLOv5 algorithm, a perspective-transformation-based disparity algorithm is used to estimate a disparity value for each pixel during detection, and a trained disparity neural network replaces the traditional triangulation principle to carry out object detection and ranging. To accelerate the computation, the OpenCL parallel computing framework is used to exploit the parallel computing capability of the device, raising running efficiency by 43%, making detection more accurate and faster, and improving ranging accuracy. First, the system calls the binocular camera to detect the target object and obtain the corresponding bounding box and confidence score; then, using the viewing-angle difference between the two cameras and the stereo matching capability of binocular vision, it computes the actual distance to the target. Experiments were conducted outdoors. The results show that the system can accurately detect outdoor target obstacles and measure their distance, with a ranging error of no more than 4% over the range 0.5–1.5 m, providing a technical reference for inspection robots to automatically avoid obstacles.
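To make the pipeline described above concrete, the following is a minimal Python sketch, not the authors' implementation: it pairs a pretrained YOLOv5 detector (loaded via torch.hub) with OpenCV's classical StereoSGBM matcher as a stand-in for the paper's trained disparity network and OpenCL-accelerated disparity algorithm. The focal length FOCAL_PX, baseline BASELINE_M, and image file names are hypothetical calibration values, and depth is recovered here with the classical triangulation relation Z = f·B/d that the paper's learned model replaces.

import cv2
import torch

# Hypothetical stereo calibration values (not taken from the paper).
FOCAL_PX = 700.0    # focal length in pixels
BASELINE_M = 0.12   # baseline between the two cameras, in metres

# Pretrained YOLOv5 detector from the ultralytics hub.
model = torch.hub.load('ultralytics/yolov5', 'yolov5s')

left = cv2.imread('left.png')    # hypothetical rectified stereo pair
right = cv2.imread('right.png')

# Detect objects in the left view; each row is (x1, y1, x2, y2, conf, cls).
detections = model(cv2.cvtColor(left, cv2.COLOR_BGR2RGB)).xyxy[0]

# Dense disparity via semi-global block matching (a classical stand-in for
# the paper's disparity neural network). SGBM returns fixed-point
# disparities scaled by 16.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disp = matcher.compute(cv2.cvtColor(left, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)).astype('float32') / 16.0

for x1, y1, x2, y2, conf, cls in detections.tolist():
    cx, cy = int((x1 + x2) / 2), int((y1 + y2) / 2)  # centre of the bounding box
    d = disp[cy, cx]
    if d > 0:  # valid stereo match
        depth_m = FOCAL_PX * BASELINE_M / d  # triangulation: Z = f * B / d
        print(f'class {int(cls)}  conf {conf:.2f}  distance {depth_m:.2f} m')

As a usage note, OpenCV's transparent OpenCL path (wrapping images in cv2.UMat) is one readily available way to push such per-pixel disparity work onto a GPU, though it is not the OpenCL implementation whose 43% speed-up the paper reports.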
Authors
TAN Bin; WANG Ting (School of Civil Engineering and Architecture, Nanchang Hangkong University, Nanchang 330063, China)
Source
Science Technology and Engineering (《科学技术与工程》)
PKU Core Journal (北大核心)
2024, No. 21, pp. 9015–9024 (10 pages)
Funding
National Natural Science Foundation of China (51968051).