
Visual Inertial Positioning Method Based on Tight Coupling (基于紧耦合的视觉惯性定位方法)

Cited by: 1
Abstract: The inertial measurement unit (IMU) is disturbed by its own temperature, bias, vibration, and other factors, so its integrated pose tends to diverge; in addition, monocular visual positioning accuracy degrades when the robot moves rapidly. This paper therefore studies a tightly coupled visual-inertial simultaneous localization and mapping (SLAM) method. First, the localization problem of visual odometry (VO) is studied; to reduce feature-point mismatching, the Oriented FAST and Rotated BRIEF (ORB) feature extraction method is adopted. Then the mathematical model of the IMU is constructed, and the discrete integral of the motion model is obtained with the median (midpoint) method. Finally, the monocular visual pose is aligned with the IMU trajectory, and the optimal state estimate of the robot's motion is obtained by sliding-window-based nonlinear optimization. The method is verified by two experiments: a constructed simulation scene and a comparison with the monocular ORB-SLAM algorithm. The results show that the proposed method outperforms VO alone, with positioning accuracy held at about 0.4 m, a 30% improvement over the traditional tracking model.
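ORB descriptors are binary strings compared by Hamming distance, and mismatches are commonly rejected with a nearest-to-second-nearest ratio test. The following is a minimal pure-Python sketch of that matching idea; the descriptors are toy byte strings and the `ratio` threshold is an illustrative assumption, not a value taken from the paper (real ORB descriptors would come from a detector such as OpenCV's `cv2.ORB_create()`).

```python
# Hedged sketch: brute-force Hamming matching of binary (ORB-style)
# descriptors with a ratio test to discard ambiguous correspondences.

def hamming(d1: bytes, d2: bytes) -> int:
    """Bit-level Hamming distance between two equal-length descriptors."""
    return sum(bin(a ^ b).count("1") for a, b in zip(d1, d2))

def match_ratio_test(query, train, ratio=0.8):
    """Return (query_idx, train_idx) pairs whose best match is clearly
    closer than the second-best (Lowe-style ratio test)."""
    matches = []
    for qi, qd in enumerate(query):
        # Rank all candidate descriptors by Hamming distance.
        order = sorted(range(len(train)), key=lambda ti: hamming(qd, train[ti]))
        best, second = order[0], order[1]
        if hamming(qd, train[best]) < ratio * hamming(qd, train[second]):
            matches.append((qi, best))
    return matches
```

In practice the same filtering is available through OpenCV's brute-force matcher with `NORM_HAMMING`; the sketch only shows why the ratio test suppresses the mismatches mentioned in the abstract.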
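The median (midpoint) discrete integration mentioned in the abstract propagates position, velocity, and orientation using the average of the bias-corrected IMU readings at the two ends of each interval. A minimal sketch of one such step is below; the bias terms `b_a`, `b_g`, the gravity vector, and the small-angle quaternion update are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q."""
    qv = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mult(quat_mult(q, qv), q_conj)[1:]

def midpoint_step(p, v, q, acc0, acc1, gyr0, gyr1, dt,
                  b_a=np.zeros(3), b_g=np.zeros(3),
                  g=np.array([0.0, 0.0, -9.81])):
    """One midpoint-integration step of the IMU kinematic model."""
    # Orientation: rotate by the mean bias-corrected angular rate
    # (small-angle quaternion update, renormalized).
    w_mid = 0.5 * (gyr0 + gyr1) - b_g
    dq = np.concatenate(([1.0], 0.5 * w_mid * dt))
    q1 = quat_mult(q, dq)
    q1 /= np.linalg.norm(q1)
    # Acceleration: average of the world-frame accelerations at k and k+1.
    a0 = quat_rotate(q, acc0 - b_a) + g
    a1 = quat_rotate(q1, acc1 - b_a) + g
    a_mid = 0.5 * (a0 + a1)
    p1 = p + v * dt + 0.5 * a_mid * dt * dt
    v1 = v + a_mid * dt
    return p1, v1, q1
```

A quick sanity check: a stationary IMU measures only the specific force opposing gravity, so the propagated position, velocity, and orientation should not change.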
Authors: LU Jiawei; XU Zhe (College of Engineering, Shanghai Ocean University, Shanghai 201306, China)
Source: GNSS World of China (《全球定位系统》), CSCD, 2021, No. 1, pp. 36-42 (7 pages)
Keywords: visual inertia; visual odometry (VO); ORB feature points; IMU; nonlinear optimization
