RGB-D SLAM Method for Dynamic Scenes Based on Instance Segmentation and Optical Flow
Abstract: To improve camera pose accuracy in RGB-D SLAM for dynamic scenes, a high-precision RGB-D SLAM method based on instance segmentation and optical flow is proposed. First, an instance segmentation algorithm detects the objects in the scene; non-rigid objects are removed and a semantic map is constructed. Next, motion residuals are computed from optical flow information to detect dynamic rigid objects, which are then tracked in the semantic map. Then, the dynamic feature points on non-rigid objects and dynamic rigid objects are removed in each frame, and the camera pose is optimized using the remaining stable feature points. Finally, the static background is reconstructed with a TSDF model, and the dynamic rigid objects are displayed as point clouds. Tests on the TUM and Bonn datasets show that, compared with ACEFusion, a state-of-the-art SLAM system, the proposed method improves camera pose accuracy by approximately 43%. Ablation results show that retaining the feature points of dynamic rigid objects while they are stationary improves camera pose estimation by approximately 37%. Dense mapping experiments show that the proposed method outperforms current advanced work in dynamic-scene reconstruction, with an average reconstruction error of 0.042 m. The code is open-sourced at https://github.com/wawcg/dy_wcg.
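The dynamic-object handling described above (motion residuals from optical flow, then removal of dynamic feature points) can be illustrated with a minimal sketch. This is not the paper's implementation: the pinhole back-projection model, the per-instance boolean masks assumed to come from the segmentation stage, the residual threshold, and all function names below are assumptions for illustration only.

```python
# Minimal sketch of per-instance motion-residual detection and feature-point filtering.
# All names and thresholds are hypothetical; the paper's exact formulation may differ.
import numpy as np

def induced_flow(depth, K, T_rel):
    """Flow that the estimated camera motion alone would induce on a static scene.
    depth: HxW depth map (m); K: 3x3 intrinsics; T_rel: 4x4 relative camera pose."""
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    z = depth
    # Back-project pixels of frame t into 3D camera coordinates.
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]
    pts = np.stack([x, y, z, np.ones_like(z)], axis=-1) @ T_rel.T
    # Re-project into frame t+1 and take the pixel displacement.
    u2 = K[0, 0] * pts[..., 0] / pts[..., 2] + K[0, 2]
    v2 = K[1, 1] * pts[..., 1] / pts[..., 2] + K[1, 2]
    return np.stack([u2 - u, v2 - v], axis=-1)

def dynamic_instances(flow, depth, K, T_rel, masks, thresh=1.5):
    """Mark instances whose mean flow residual exceeds `thresh` pixels as dynamic.
    flow: HxWx2 measured optical flow; masks: dict {instance_id: HxW bool mask}."""
    residual = np.linalg.norm(flow - induced_flow(depth, K, T_rel), axis=-1)
    return {i for i, m in masks.items()
            if residual[m & (depth > 0)].mean() > thresh}

def filter_keypoints(keypoints, masks, dynamic_ids):
    """Keep only feature points that do not fall on a dynamic (or non-rigid) instance."""
    keep = []
    for kp in keypoints:                      # kp = (u, v) pixel coordinates
        u, v = int(round(kp[0])), int(round(kp[1]))
        if not any(masks[i][v, u] for i in dynamic_ids):
            keep.append(kp)
    return keep
```

Under these assumptions, a rigid object that temporarily stops moving yields a low residual and is no longer flagged as dynamic, so its feature points are re-admitted to pose optimization; this is consistent with the ablation finding that retaining such points improves pose estimation.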
Authors: WANG Chenggen, SHI Jinlong, ZHU Haowei, BAI Suqin, SUN Yunhan, LU Jiawen, HUANG Shucheng (School of Computer Science and Engineering, Jiangsu University of Science and Technology, Zhenjiang 212000, China; State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210046, China)
Source: Optics and Precision Engineering (EI, CAS, CSCD, Peking University Core), 2024, No. 6, pp. 857-867 (11 pages)
Funding: National Natural Science Foundation of China (No. 62276118, No. 61772244); Open Fund of the Civil Aviation Intelligent Airport Theory and Key Laboratory, Civil Aviation University of China (No. SATS202207)
Keywords: dynamic scenes; simultaneous localization and mapping (SLAM); instance segmentation; optical flow