Abstract
Road remnants are non-vehicle obstacles on the road surface that affect traffic safety. Because they span a very wide range of object types, ordinary object detection methods would require an impractically large amount of training data, which is difficult to achieve in engineering practice. To address this problem, the paper proposes a multi-step method for the dynamic detection of road remnants. First, the road images collected by a detection vehicle are segmented, using a Fully Convolutional Network (FCN) to train a road-information segmentation model; the segmentation output covers all road information in the scene to be inspected, including vehicles, road surface, pedestrians, and remnants. Next, a detection model is trained with the RetinaNet algorithm or the YOLO (You Only Look Once) network, and the vehicle and pedestrian regions are removed by object detection, so that the image contains only the road surface and remnants. The image is then partitioned into grid cells, each cell is classified as either remnant or road surface, and a Support Vector Machine (SVM) is trained to obtain the road-surface/remnant model. Finally, connected remnant cells are merged into remnant regions and marked in the original image. The proposed method effectively handles the diversity of remnants and changes in background, and maintains high reliability and stability across different road environments.
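The final step of the pipeline merges connected remnant grid cells into remnant regions. The paper gives no implementation details, so the following is only a minimal sketch under assumptions: the per-cell SVM labels are given as a binary grid (1 = remnant, 0 = road surface), cells are considered connected along the four axis directions, and the hypothetical helper `merge_remnant_cells` returns one bounding box per connected cluster, which could then be drawn on the original image.

```python
from collections import deque

def merge_remnant_cells(grid):
    """Merge 4-connected remnant cells (value 1) in a binary grid into
    regions, returned as (row_min, col_min, row_max, col_max) boxes."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                # Breadth-first flood fill over one connected cluster,
                # tracking its bounding box as cells are visited.
                queue = deque([(r, c)])
                seen[r][c] = True
                rmin = rmax = r
                cmin = cmax = c
                while queue:
                    y, x = queue.popleft()
                    rmin, rmax = min(rmin, y), max(rmax, y)
                    cmin, cmax = min(cmin, x), max(cmax, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append((rmin, cmin, rmax, cmax))
    return regions

# Example: two separate remnant clusters in a 4x5 grid of SVM cell labels.
labels = [
    [0, 1, 1, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 1],
]
print(merge_remnant_cells(labels))  # → [(0, 1, 1, 2), (2, 3, 3, 4)]
```

The grid-cell boxes would still need to be scaled back to pixel coordinates of the original image; that mapping depends on the cell size chosen for the grid classification step.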
Authors
HOU Rui; TANG Zhenmin (College of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094)
Source
Computer & Digital Engineering (《计算机与数字工程》)
2023, No. 8, pp. 1756-1760, 1765 (6 pages)