Kinect Point Cloud Registration Method Based on Epipolar and Point-to-Plane Constraints (cited by 3)

Abstract: As a lightweight handheld sensor, Kinect offers a flexible and efficient means of indoor scene recovery and model reconstruction. Unlike most reconstruction algorithms, which rely on color images alone or depth images alone, this paper proposes a point cloud registration algorithm that combines the two and applies it to indoor model reconstruction; the process comprises registration of adjacent frames followed by global optimization. With the Kinect accurately calibrated, the epipolar constraints formed by corresponding points matched across the color images are combined with the point-to-plane constraints of iterative closest point (ICP) registration of the depth images, improving the accuracy and robustness of adjacent-frame registration. Because frame-by-frame ICP registration inevitably accumulates error, a coplanarity constraint on points tracked through four adjacent frames is then applied to globally optimize the pairwise registration results, refining the estimated Kinect positions and orientations and the reconstructed model. Experiments on models and indoor scenes confirm the theoretical analysis: the algorithm remains robust even in scenes where KinectFusion fails to track and reconstruct, and the point cloud registration and modeling accuracy is consistent with Kinect's observation accuracy.
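The per-frame registration step described above pools two kinds of residuals into one objective. Below is a minimal sketch of one plausible form of that combination, in NumPy; the function names, the Sampson-style normalization of the epipolar error, and the scalar weight w_epi are illustrative assumptions, since this record does not give the paper's exact formulation.

```python
import numpy as np

def point_to_plane_residuals(R, t, src_pts, dst_pts, dst_normals):
    """Signed distance of each transformed source point to the tangent
    plane of its matched destination point (classic point-to-plane ICP)."""
    transformed = src_pts @ R.T + t                      # (N, 3)
    return np.einsum("ij,ij->i", transformed - dst_pts, dst_normals)

def epipolar_residuals(R, t, K, x1, x2):
    """Sampson-normalized epipolar error x2^T F x1 for matched pixel
    coordinates x1 <-> x2 (each (N, 2)), with F = K^-T [t]_x R K^-1."""
    tx = np.array([[0.0, -t[2], t[1]],                   # cross-product
                   [t[2], 0.0, -t[0]],                   # matrix [t]_x
                   [-t[1], t[0], 0.0]])
    Kinv = np.linalg.inv(K)
    F = Kinv.T @ tx @ R @ Kinv                           # fundamental matrix
    h1 = np.column_stack([x1, np.ones(len(x1))])         # homogeneous coords
    h2 = np.column_stack([x2, np.ones(len(x2))])
    Fx1 = h1 @ F.T                                       # epipolar lines in image 2
    Ftx2 = h2 @ F                                        # epipolar lines in image 1
    algebraic = np.einsum("ij,ij->i", h2, Fx1)           # x2^T F x1
    grad = np.sqrt(Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2)
    return algebraic / grad

def combined_cost(R, t, K, src, dst, normals, x1, x2, w_epi=1.0):
    """Joint objective: point-to-plane ICP term plus weighted epipolar term."""
    r_icp = point_to_plane_residuals(R, t, src, dst, normals)
    r_epi = epipolar_residuals(R, t, K, x1, x2)
    return np.sum(r_icp**2) + w_epi * np.sum(r_epi**2)
```

Minimizing combined_cost over (R, t), e.g. with Gauss-Newton on a local rotation parameterization, couples the color-image geometry to the depth-image geometry; that coupling is what the abstract credits for the improved robustness in scenes where depth-only ICP is weakly constrained.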
Authors: YE Qin, YAO Yahui, GUI Popo (College of Surveying and Geo-informatics, Tongji University, Shanghai 200092, China)
Source: Geomatics and Information Science of Wuhan University (EI, CSCD, Peking University Core Journal), 2017, No. 9, pp. 1271-1277 (7 pages)
Funding: Shanghai Natural Science Foundation (13ZR1444300)
Keywords: Kinect; epipolar constraints; point-to-plane constraints; four-point coplanarity
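The global optimization step of the abstract constrains each point tracked through four consecutive frames to remain coplanar after its four observations are mapped into a common frame. The sketch below scores that constraint with a scalar triple product, again in NumPy; the record does not give the paper's actual residual, so this formulation and all names in it are assumptions.

```python
import numpy as np

def coplanarity_residual(pts4):
    """Scalar triple product of the difference vectors of four 3D points
    (six times the signed tetrahedron volume); zero iff coplanar."""
    v1, v2, v3 = pts4[1] - pts4[0], pts4[2] - pts4[0], pts4[3] - pts4[0]
    return np.dot(v1, np.cross(v2, v3))

def four_frame_coplanarity_cost(poses, tracks):
    """poses: four (R, t) pairs mapping frames k = 0..3 into a common frame.
    tracks: (N, 4, 3) array, the same N physical points observed in four
    consecutive frames, each in that frame's local coordinates."""
    world = np.stack([tracks[:, k] @ poses[k][0].T + poses[k][1]
                      for k in range(4)], axis=1)        # (N, 4, 3)
    return sum(coplanarity_residual(p)**2 for p in world)
```

Summed over sliding four-frame windows, such a penalty ties non-adjacent poses together, which is how a global adjustment can work accumulated pairwise ICP drift back out of the trajectory.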
