
Improving RGB-D SLAM through Extracting Static Regions Using a Delaunay Triangle Mesh
Abstract: To solve the problem of simultaneous localization and mapping (SLAM) in dynamic environments for robot navigation, a real-time RGB-D SLAM approach that runs stably in dynamic scenes is proposed. A static-region extraction method segments dynamic objects out of the static background, and the feature points of the static region are fed into the system's RANSAC solver to estimate the camera trajectory, enabling robust localization in dynamic scenes. A Delaunay triangle mesh is built on the current frame, the distance consistency of matched point pairs between the current and reference frames is checked, and dynamic objects are removed from the mesh by deleting edges whose endpoints disagree in dynamic/static state. Combined with a weighted Bag-of-Words method, system accuracy is further improved by reducing the weight of dynamic objects in dynamic scenes. Experimental results demonstrate that, compared with an existing real-time SLAM method, the proposed method improves accuracy by 81.37% on the high-dynamic sequences of the TUM RGB-D dataset, significantly improving the localization accuracy of mobile robots in dynamic scenes.
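The core of the pipeline described above is the rigid-body distance-consistency test on Delaunay edges. Below is a minimal Python sketch of that idea, assuming per-point depth back-projections are available for both frames; the function name `filter_dynamic_points`, the 5 cm tolerance, and the keep-if-any-consistent-edge rule are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of Delaunay distance-consistency filtering (illustrative only;
# names, the tolerance, and the keep rule are assumptions, not the paper's code).
import numpy as np
from scipy.spatial import Delaunay

def filter_dynamic_points(kp_2d, kp_3d_cur, kp_3d_ref, tol=0.05):
    """Return a boolean mask marking keypoints that look static.

    kp_2d     : (N, 2) pixel coordinates of keypoints in the current frame
    kp_3d_cur : (N, 3) the same keypoints back-projected with current depth
    kp_3d_ref : (N, 3) the matched keypoints back-projected in the reference frame
    tol       : allowed change in edge length (meters); 0.05 is an assumed value
    """
    # Triangulate the current frame's keypoints in image space.
    tri = Delaunay(kp_2d)

    # Collect the unique undirected edges of the mesh.
    edges = set()
    for a, b, c in tri.simplices:
        edges.update({tuple(sorted((a, b))),
                      tuple(sorted((b, c))),
                      tuple(sorted((a, c)))})

    # A rigid (static) scene preserves pairwise 3D distances between frames,
    # so an edge whose length changes noticeably has at least one dynamic end.
    consistent_degree = np.zeros(len(kp_2d), dtype=int)
    for i, j in edges:
        d_cur = np.linalg.norm(kp_3d_cur[i] - kp_3d_cur[j])
        d_ref = np.linalg.norm(kp_3d_ref[i] - kp_3d_ref[j])
        if abs(d_cur - d_ref) < tol:
            consistent_degree[i] += 1
            consistent_degree[j] += 1

    # Keep points that retain at least one distance-consistent edge; points
    # whose edges were all deleted are treated as dynamic.
    return consistent_degree > 0

# The surviving static matches could then drive a standard RANSAC PnP solve,
# e.g. with OpenCV (K being the 3x3 camera intrinsic matrix):
#
#   import cv2
#   static = filter_dynamic_points(kp_2d, kp_3d_cur, kp_3d_ref)
#   ok, rvec, tvec, inliers = cv2.solvePnPRansac(
#       kp_3d_ref[static].astype(np.float64),
#       kp_2d[static].astype(np.float64), K, None)
```

Note that the paper deletes individual inconsistent edges and segments whole dynamic objects from the mesh; this sketch collapses that step into a per-point mask for brevity.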
Authors: ZHANG Xiaoguo, ZHENG Bingqing, LIU Qihan, WANG Qin, YANG Yuan (School of Instrument Science and Engineering, Southeast University, Nanjing 210000, China)
Source: Journal of Chinese Inertial Technology (EI, CSCD, PKU Core), 2019, Issue 5, pp. 661-669 (9 pages)
Funding: National Key R&D Program of China (2016YFB0502103); National Natural Science Foundation of China (61601123)
Keywords: SLAM; dynamic environments; static region extraction; Delaunay triangulation; loop detection
