
Dynamic Obstacle Detection and Representation Approach for Unmanned Vehicles Based on Laser Sensor

Cited by: 34
Abstract: To address the data processing delay and low detection accuracy encountered when laser sensors are used to detect dynamic obstacles in outdoor environments, a dynamic obstacle detection and representation approach is proposed based on the fusion of information from a 3D laser sensor (Velodyne) and a four-line laser sensor (Ibeo). By analyzing and processing the Velodyne data, the approach detects and tracks dynamic obstacles around the unmanned vehicle. For the sector region in front of the vehicle, where higher accuracy is required, confidence distance theory is used to fuse the motion-state information derived from the Velodyne data with that output by the Ibeo sensor, which significantly improves the detection accuracy of obstacle motion states. The positions of moving obstacles are then corrected for the processing delay according to the fusion result, and the cells occupied by dynamic obstacles are finally marked distinctly from those occupied by static obstacles in an obstacle occupancy grid map. The approach not only detects obstacle motion information accurately in outdoor environments, but also eliminates the position deviation of dynamic obstacles caused by sensor data processing delay, so that the dynamic and static obstacles in the environment are described more accurately by the occupancy grid map. The approach has been deployed on a self-developed unmanned vehicle platform; extensive experiments and the vehicle's strong performance in the "Intelligent Vehicle Future Challenge of China" demonstrate its reliability and accuracy.
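The abstract names three computational steps without giving their formulas: a confidence-distance consistency check between the Velodyne-derived and Ibeo-reported motion estimates, fusion of the consistent estimates, and delay compensation of the dynamic obstacle position before it is written into the occupancy grid. The Python sketch below is only a minimal illustration of how such a pipeline could look, not the paper's implementation; the function names (confidence_distance, fuse_velocity, compensate_delay, mark_grid), the noise sigmas, the 0.6 consistency threshold, the 0.2 s delay, the 0.5 m grid resolution, and the cell labels are all assumptions introduced here.

```python
# Illustrative sketch only -- not the authors' implementation. All constants
# (sigmas, threshold, delay, grid resolution, cell labels) are assumptions.
import math

import numpy as np


def confidence_distance(x_i: float, sigma_i: float, x_j: float) -> float:
    """Confidence distance d_ij = 2 * |integral of N(x; x_i, sigma_i^2) from x_i to x_j|.

    Lies in [0, 1); small values mean the two readings are mutually consistent.
    Evaluated in closed form with the Gaussian error function.
    """
    return abs(math.erf((x_j - x_i) / (math.sqrt(2.0) * sigma_i)))


def fuse_velocity(v_velodyne, sigma_velodyne, v_ibeo, sigma_ibeo, threshold=0.6):
    """Fuse per-axis velocity estimates from the two sensors.

    When the confidence distance stays below `threshold` (assumed value), the
    estimates are combined with inverse-variance weights; otherwise the Ibeo
    estimate is kept (an assumed fallback for the front sector -- the paper's
    actual rule is not given in the abstract).
    """
    fused = []
    for v_v, v_i in zip(v_velodyne, v_ibeo):
        if confidence_distance(v_v, sigma_velodyne, v_i) < threshold:
            w_v, w_i = 1.0 / sigma_velodyne ** 2, 1.0 / sigma_ibeo ** 2
            fused.append((w_v * v_v + w_i * v_i) / (w_v + w_i))
        else:
            fused.append(v_i)
    return np.asarray(fused)


def compensate_delay(position, velocity, processing_delay):
    """Shift a tracked position by velocity * delay (constant-velocity assumption)."""
    return np.asarray(position, dtype=float) + np.asarray(velocity) * processing_delay


def mark_grid(grid, position, resolution, label):
    """Write `label` (e.g. 1 = static, 2 = dynamic) into the corresponding grid cell."""
    col, row = int(position[0] / resolution), int(position[1] / resolution)
    if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
        grid[row, col] = label
    return grid


if __name__ == "__main__":
    grid = np.zeros((200, 200), dtype=np.uint8)            # 100 m x 100 m at 0.5 m cells
    v_fused = fuse_velocity(v_velodyne=(4.8, 0.1), sigma_velodyne=0.5,
                            v_ibeo=(5.0, 0.0), sigma_ibeo=0.2)
    pos = compensate_delay(position=(20.0, 35.0),           # metres, grid frame
                           velocity=v_fused,
                           processing_delay=0.2)            # assumed 0.2 s pipeline latency
    mark_grid(grid, pos, resolution=0.5, label=2)           # 2 marks a dynamic-obstacle cell
    print("fused velocity:", v_fused, "corrected position:", pos)
```

Inverse-variance weighting and a constant-velocity delay correction are standard choices consistent with the behaviour described above; the paper's actual fusion rule and sector-dependent handling may differ.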
Source: Robot (《机器人》), EI / CSCD / Peking University Core Journal, 2014, No. 6, pp. 654-661 (8 pages).
Funding: Key Project of the Major Research Plan of the National Natural Science Foundation of China (91120307); Integrated Project of the Major Research Plan of the National Natural Science Foundation of China (91320301); Young Scientists Fund of the National Natural Science Foundation of China (61005091).
Keywords: unmanned vehicle; dynamic obstacle detection; grid map; laser sensor; Velodyne; Ibeo

References (11)

  • 1. Petrovskaya A, Thrun S. Model based vehicle detection and tracking for autonomous urban driving[J]. Autonomous Robots, 2009, 26(2/3): 123-139.
  • 2. Montemerlo M, Becker J, Bhat S, et al. Junior: The Stanford entry in the urban challenge[J]. Journal of Field Robotics, 2008, 25(9): 569-597.
  • 3. Ferguson D, Darms M, Urmson C, et al. Detection, prediction, and avoidance of dynamic obstacles in urban environments[C]//IEEE Intelligent Vehicles Symposium. Piscataway, USA: IEEE, 2008: 1149-1154.
  • 4. Urmson C, Anhalt J, Bagnell D, et al. Autonomous driving in urban environments: Boss and the urban challenge[J]. Journal of Field Robotics, 2008, 25(8): 425-466.
  • 5. Mertz C, Navarro-Serment L E, MacLachlan R, et al. Moving object detection with laser scanners[J]. Journal of Field Robotics, 2013, 30(1): 17-43.
  • 6. Dorai C, Wang G, Jain A K, et al. Registration and integration of multiple object views for 3D model construction[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, 20(1): 83-89.
  • 7. Himmelsbach M, Müller A, Lüttel T, et al. LIDAR-based 3D object perception[C]//Proceedings of the 1st International Workshop on Cognition for Technical Systems. 2008.
  • 8. Pears N E. Feature extraction and tracking for scanning range sensors[J]. Robotics and Autonomous Systems, 2000, 33(1): 43-58.
  • 9. Tubbs J D. A note on binary template matching[J]. Pattern Recognition, 1989, 22(4): 359-365.
  • 10. Petrovskaya A, Thrun S. Efficient techniques for dynamic vehicle detection[C]//11th International Symposium on Experimental Robotics. Berlin, Germany: Springer, 2009: 79-91.

Shared references: 75

Co-cited documents: 194

Citing documents: 34

Second-level citing documents: 209
