Abstract
A flexible array-type tactile sensor was independently developed for collaborative robots and packaged into a tactile handle that can sense the grip posture and grip force of the human hand. A convolutional neural network (CNN)-based method is proposed to distinguish three interaction states with the tactile handle: loose grip, tight grip, and inadvertent touch, with a recognition accuracy of 98.2%. A variable admittance control strategy is further proposed that adjusts the virtual damping of the manipulator in real time according to the recognized grip state. Based on the tactile handle, local changes in hand posture can be sensed in real time, the operator's intention can be accurately estimated, and the local perception information is transmitted to the robot to control its motion. Taking a UR collaborative robot as the experimental platform and the tactile handle as the perceptual input, human-robot interaction experiments were conducted and the motion accuracy of the manipulator was evaluated. The experimental results show that the tactile handle has good intention-perception capability.
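The abstract does not spell out the control law or the network structure, so the following two sketches are illustrative only. A variable admittance scheme of the kind described is commonly written as a virtual mass-damper driven by the measured hand force, with the damping switched by the recognized grip state; the symbols M_d, B_d, F_h and the way the damping is switched below are assumptions, not values from the paper:

    M_d \ddot{x}(t) + B_d(s)\,\dot{x}(t) = F_h(t)

where s is the recognized grip state, B_d(s) is set low for a tight grip (compliant, easy to drag), higher for a loose grip, and the commanded velocity is held at zero when only an inadvertent touch is detected.

A minimal sketch of the grip-state classifier, assuming the tactile array is read as a small single-channel pressure image; the 16x16 input size, layer widths, and class ordering are hypothetical and not taken from the paper:

import torch
import torch.nn as nn

class GripStateCNN(nn.Module):
    # Classifies one tactile pressure frame as loose grip, tight grip, or inadvertent touch.
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # single-channel pressure map
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 16x16 -> 8x8
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 8x8 -> 4x4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),                    # loose / tight / inadvertent touch
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Example: one hypothetical 16x16 tactile frame, batch size 1
frame = torch.rand(1, 1, 16, 16)
logits = GripStateCNN()(frame)
state = logits.argmax(dim=1)   # 0: loose grip, 1: tight grip, 2: inadvertent touch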
Authors
Li Tiejun; Liu Yingxin; Liu Jinyue; Yang Dong (School of Mechanical Engineering, Hebei University of Technology, Tianjin 300132, China)
Source
Chinese Journal of Scientific Instrument (《仪器仪表学报》)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
2020, No. 1, pp. 100-112 (13 pages)
Funding
National Natural Science Foundation of China (U1813222)
National Key R&D Program of China (2018YFB1306902, 2017YFB1301002)
Keywords
human-robot cooperation
tactile perception
human hand grip recognition
intention understanding