Funding: Supported by the National Natural Science Foundation of China (Nos. 52175504 and 51927811), the Fundamental Research Funds for the Central Universities of China (No. PA2022GDSK0074), and the National Key Research and Development Program of China (No. 2022CSJGG1303).
Abstract: The Rotational Vision System (RVS) is a common active vision system with only rotational degrees of freedom. The rotational degrees of freedom are usually provided by a turntable and a pan-tilt head, or by the eye-in-hand (EIH) structure of an articulated robot arm. Due to assembly deviations and limited manufacturing accuracy, the ideal assumption that the rotation axis is fully aligned with a coordinate axis of the local camera is usually violated. To address this issue, we propose a generalized deviation model that specifies a rotation axis relating the rotational motion of the platform to the external orientation (EO) of the camera. On this basis, we propose a heuristic estimation algorithm that minimizes the global reprojection error and fits circles in space under global optimization constraints. Experiments show that the average reprojection errors of dynamic EO reconstruction based on the reprojection-error method are 0.14 pixels in translation and 0.08 pixels in tilt. In the absence of angle measurements, the circle-fitting method yields comparable results (relative error of about 2%), meeting the requirements of general visual measurement applications.
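The abstract does not reproduce the circle-fitting step, so the following is only a minimal sketch of the generic idea under stated assumptions: camera centres observed at several rotation angles are fitted with a plane (via SVD), projected into that plane, and fitted with an algebraic (Kasa) circle; the plane normal through the circle centre then serves as an estimate of the rotation axis. The function name fit_rotation_axis, the synthetic data, and the noise level are illustrative assumptions, not the paper's algorithm.

import numpy as np

def fit_rotation_axis(points):
    """points: (N, 3) camera centres observed at several platform rotation angles."""
    centroid = points.mean(axis=0)
    # Best-fit plane of rotation: the right singular vector with the smallest
    # singular value is the plane normal; the other two span the plane.
    _, _, vt = np.linalg.svd(points - centroid)
    u, v, normal = vt[0], vt[1], vt[2]

    # 2D coordinates of the points inside the fitted plane.
    d = points - centroid
    xy = np.column_stack((d @ u, d @ v))

    # Algebraic (Kasa) circle fit: solve [2x 2y 1] [a b c]^T = x^2 + y^2.
    A = np.column_stack((2.0 * xy, np.ones(len(xy))))
    rhs = (xy ** 2).sum(axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    centre = centroid + a * u + b * v   # circle centre lifted back into 3D

    return centre, normal, radius       # point on the axis, axis direction, radius

if __name__ == "__main__":
    # Synthetic check: a unit circle in a tilted plane plus mild noise.
    u3 = np.array([1.0, 0.0, 0.1]); u3 /= np.linalg.norm(u3)
    v3 = np.array([0.0, 1.0, 0.0])
    theta = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
    ring = np.outer(np.cos(theta), u3) + np.outer(np.sin(theta), v3)
    print(fit_rotation_axis(ring + 1e-3 * np.random.randn(*ring.shape)))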
Funding: Supported by the National Natural Science Foundation of China (No. 60905061) and the National Natural Science Foundation of Tianjin (No. 08JCYBJC12700).
Abstract: In this paper, a method with an eye-in-hand configuration is developed to hit targets during visual tracking in the TLS (Tele-Light Saber) game. Neither calibration of camera parameters nor prediction of the moving object's trajectory is required. First, the expression of the image Jacobian matrix for the eye-in-hand configuration is derived, and an update law is designed to estimate the image Jacobian online. A control scheme is then presented, and the Lyapunov method is employed to prove asymptotic convergence of the image errors. No assumption about the motion of the objects is needed. Finally, both simulation and experimental results are presented to support the proposed approach.
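The abstract states that the image Jacobian is estimated online without camera calibration but does not give the form of the update law. The sketch below uses a Broyden-style rank-one update, a standard choice for uncalibrated eye-in-hand visual servoing, together with a simple proportional law based on the Jacobian pseudoinverse; the class name, gains, and dimensions are illustrative assumptions rather than the paper's design.

import numpy as np

class UncalibratedEIHServo:
    """Online image-Jacobian estimation and proportional visual servoing."""

    def __init__(self, n_features, n_joints, gain=0.5, alpha=0.2):
        self.J = 0.1 * np.random.randn(n_features, n_joints)  # rough initial guess
        self.gain = gain        # proportional servo gain
        self.alpha = alpha      # Broyden update step size

    def update_jacobian(self, d_error, d_q):
        """Rank-one (Broyden) correction so that J maps the last joint step d_q
        onto the observed change of the image error d_error."""
        denom = float(d_q @ d_q)
        if denom > 1e-12:
            self.J += self.alpha * np.outer(d_error - self.J @ d_q, d_q) / denom

    def control(self, error):
        """Joint-velocity command driving the image-space error toward zero."""
        return -self.gain * np.linalg.pinv(self.J) @ error

In a servo loop, one would alternate between calling control() to command the robot and update_jacobian() with the measured change in joint angles and image error from the last step.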
Funding: Project supported by the National Natural Science Foundation of China (No. U1609210) and the Science and Technology Project of Zhejiang Province, China (No. 2019C01043).
Abstract: Visual servo control laws, which plan robot motion using image data acquired from a camera mounted on the robot, have been widely applied to the motion control of robotic arms and mobile robots. These methods are usually classified into image-based visual servo, position-based visual servo, and hybrid visual servo (HVS) control. Mobile manipulation extends the working range and flexibility of robotic arms, yet little work has applied visual servo control to the motion of the whole mobile manipulation robot. We propose an HVS motion control method for a mobile manipulation robot that combines a six-degree-of-freedom (6-DOF) robotic arm with a nonholonomic mobile base. Based on the kinematic differential equations of the mobile manipulation robot, the global Jacobian matrix of the whole robot is derived, and the HVS control equation is obtained by combining this Jacobian with position and visual image information. The distance between the gripper and the target is computed from observations of a marker by a camera mounted on the gripper, and the differences between the positions of the marker's feature points and their expected positions in the image coordinate system are also computed. These differences are substituted into the control equation to obtain the speed control law for each degree of freedom of the mobile manipulation robot. To reduce the position error caused by observation, a Kalman filter is introduced to correct the position and orientation of the end of the manipulator. Finally, the proposed algorithm is validated on a mobile manipulation platform consisting of a Bulldog chassis, a UR5 robotic arm, and a ZED camera.
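The abstract describes mapping a stacked position and image-feature error to base and arm velocities through the whole-robot Jacobian, with a Kalman filter correcting the observed end-effector pose. The sketch below illustrates that structure under stated assumptions: the block dimensions, gains, and the linear measurement model are placeholders, not the Jacobian or control equation derived in the paper.

import numpy as np

def hvs_velocity_command(J_base, J_arm, pos_error, img_error, gain=0.8):
    """
    J_base : (m, 2) Jacobian of the stacked error w.r.t. base velocities (v, omega)
    J_arm  : (m, 6) Jacobian of the stacked error w.r.t. the 6-DOF arm joint rates
    pos_error, img_error : position error and image-feature error (stacked length m)
    Returns the (2 + 6)-dimensional speed command for the whole mobile manipulator.
    """
    J = np.hstack((J_base, J_arm))              # global Jacobian of the whole robot
    error = np.concatenate((pos_error, img_error))
    return -gain * np.linalg.pinv(J) @ error    # proportional resolved-rate law

def kalman_correct(x_pred, P_pred, z, H, R):
    """One linear Kalman correction of the predicted end-effector pose x_pred
    using the observed pose z (measurement matrix H, measurement noise R)."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P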