Abstract

Current grasp detection models perform poorly on densely occluded objects, and manual data annotation is labor-intensive. To address these problems, a step-by-step scheme that separates object detection from grasp detection, based on RGB-D image fusion, is proposed; it allows a grasp detection model trained on single-object images to be applied directly to densely occluded multi-object scenes. First, considering the multi-scale characteristics of objects in densely occluded scenes, a sub-stage path aggregation (SPA) multi-scale feature fusion module is proposed to enrich the high-dimensional semantic features of SPA-YOLO-Fusion, an object detection model with RGB-D feature-level fusion, so that the detector can locate all graspable objects. Second, the GR-ConvNet grasp detection model with RGB-D pixel-level fusion estimates the grasp points of each detected object, and a background-padding image preprocessing algorithm is proposed to reduce the mutual interference of densely occluded objects. Finally, a robotic arm grasps the target points. On the LineMOD dataset, the mAP of SPA-YOLO-Fusion is 10% and 7% higher than that of YOLOv3-tiny and YOLOv4-tiny, respectively. On the YODO_Grasp dataset, which was collected from real scenes, the grasp detection accuracy of GR-ConvNet with the background-padding preprocessing algorithm is 23% higher than that of the original model.
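The background-padding preprocessing described above can be illustrated with a minimal sketch. The paper does not specify its implementation; the function below is a plausible interpretation in which, for each bounding box produced by the object detector, all pixels outside the box are replaced with a constant fill value, so that neighbouring occluding objects cannot influence the single-object grasp detector. The function name, signature, and fill strategy are assumptions, not the authors' code.

```python
import numpy as np

def pad_background(image, bbox, fill_value=0):
    """Hypothetical sketch of background padding: keep only the pixels
    inside the detected bounding box (x1, y1, x2, y2) and overwrite the
    rest of the image with a constant fill value."""
    x1, y1, x2, y2 = bbox
    padded = np.full_like(image, fill_value)       # uniform background
    padded[y1:y2, x1:x2] = image[y1:y2, x1:x2]     # copy the object crop back
    return padded
```

In use, each padded image (together with its depth channel, processed the same way) would be fed to the grasp detector one object at a time, which is what lets a model trained on single-object images operate in cluttered scenes.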
Authors
LI Ming
LU Peng
ZHU Long
ZHU Mei-qiang
ZOU Liang
LI Ming; LU Peng; ZHU Long; ZHU Mei-qiang; ZOU Liang (Engineering Research Center of Intelligent Control for Underground Space, Ministry of Education, Xuzhou 221116, China; School of Information and Control Engineering, China University of Mining and Technology, Xuzhou 221116, China)
Source
Control and Decision (《控制与决策》)
EI
CSCD
Peking University Core Journal
2023, No. 10, pp. 2867-2874 (8 pages)
Funding
National Natural Science Foundation of China (51904297, 61901003).