
Visual-inertial odometry based on fast invariant Kalman filter
(基于快速不变卡尔曼滤波的视觉惯性里程计) Cited by: 4
Abstract: To address the camera localization problem, a visual-inertial odometry system based on a depth camera and an inertial sensor is designed; it consists of a localization part and a relocalization part. The localization part uses an invariant extended Kalman filter (IEKF) to fuse multi-level iterative closest point (ICP) estimates with inertial sensor measurements to obtain an accurate camera pose, where the ICP estimation error is quantified with the Fisher information matrix. Because massive point clouds are required as input, GPU parallel computing is employed to carry out the multi-level ICP estimation and its error quantification quickly. When the visual-inertial odometry fails to track the camera, a constant-velocity model is built from the inertial sensor data, and the random fern relocalization method is improved on the basis of this model to relocalize the odometry. Experimental results show that the designed visual-inertial odometry tracks the camera accurately and relocalizes it effectively.
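The fusion described in the abstract can be outlined in code. The following is a minimal sketch, assuming a simplified vector-valued pose state rather than the paper's invariant (Lie-group) error formulation; the function names, noise values, and the identity measurement model are hypothetical placeholders. It only illustrates the idea of letting the IMU drive the prediction step and using the inverse of the ICP Fisher information matrix as the measurement covariance in the Kalman update; the multi-level ICP itself, the GPU parallelization, and the random fern relocalization are omitted.

```python
# Illustrative EKF-style fusion sketch (not the paper's exact invariant formulation).
import numpy as np

def predict(x, P, imu_delta, Q):
    """Propagate the 6-DoF pose state with an IMU-derived increment.

    x         : (6,) minimal pose vector (translation + rotation)
    P         : (6, 6) state covariance
    imu_delta : (6,) pose increment integrated from gyro/accelerometer data
    Q         : (6, 6) process noise covariance
    """
    x_pred = x + imu_delta          # simplified additive propagation
    P_pred = P + Q                  # identity state-transition Jacobian assumed
    return x_pred, P_pred

def update_with_icp(x_pred, P_pred, icp_pose, fisher_info):
    """Correct the prediction with the ICP pose estimate.

    The ICP measurement covariance R is taken as the inverse of the Fisher
    information matrix of the point-cloud registration, so poorly constrained
    directions receive large uncertainty and are weighted down in the fusion.
    """
    R = np.linalg.inv(fisher_info + 1e-9 * np.eye(6))   # regularized inverse
    H = np.eye(6)                                        # direct pose measurement
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x_pred + K @ (icp_pose - H @ x_pred)
    P = (np.eye(6) - K @ H) @ P_pred
    return x, P

# Usage with dummy values
x, P = np.zeros(6), np.eye(6) * 0.1
x, P = predict(x, P, imu_delta=np.full(6, 0.01), Q=np.eye(6) * 1e-3)
x, P = update_with_icp(x, P, icp_pose=np.full(6, 0.012),
                       fisher_info=np.eye(6) * 400.0)
print(x)
```

In the paper the error state evolves on a matrix Lie group (the "invariant" part of the filter) and GPU parallelism accelerates both the ICP estimation and the Fisher-information computation; those aspects are not reproduced in this toy example.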
Authors: HUANG Wei-jie, ZHANG Guo-shan (黄伟杰, 张国山), School of Electrical and Information Engineering, Tianjin University, Tianjin 300072, China
Source: Control and Decision (控制与决策), 2019, No. 12, pp. 2585-2593 (9 pages); indexed in EI, CSCD, and the Peking University Core Journal list
Funding: National Natural Science Foundation of China (61473202)
Keywords: visual-inertial odometry; invariant extended Kalman filter; multi-level iterative closest point; random fern; GPU parallel computing; inertial sensor

相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部