Robot URDF calibration using touch probes and reference geometry. Is it possible? #157
I've not really seen this sort of calibration anywhere in the ROS ecosystem. In theory, robot_calibration could be expanded to do this sort of task (at least the touch-probe-and-cube version), but it doesn't exist out of the box. We do have error cost functions for things like aligning points to a plane, so you could model the cube as a series of planes, and you would then have a series of points in 3D space from the reprojection of the arm and touch probe. Actually moving the arm to a desired pose and stopping based on the probe would be something new, though (it's also entirely possible that there would be some unexpected oddities to resolve when constructing so many point-to-plane error blocks with a single point in each).
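For illustration, here is a minimal sketch of that point-to-plane idea in Python with scipy, not robot_calibration's actual Ceres-based cost blocks; the `forward_kinematics` function, the plane list, and the parameter layout are all hypothetical placeholders you would adapt to your robot:

```python
import numpy as np
from scipy.optimize import least_squares

# Each probed cube face is a plane (unit normal n, offset d): a point p
# lying on the face satisfies n . p - d = 0.
PLANES = [
    (np.array([1.0, 0.0, 0.0]), 0.05),   # +X face of a 100 mm cube
    (np.array([0.0, 1.0, 0.0]), 0.05),   # +Y face
    (np.array([0.0, 0.0, 1.0]), 0.05),   # +Z (top) face
    # ... one entry per face reachable with the probe
]

def forward_kinematics(joint_angles, offsets):
    """Placeholder: probe-tip position in the world frame, computed from
    the measured joint angles plus the joint offsets being estimated."""
    raise NotImplementedError  # supplied by your robot model

def residuals(offsets, touches):
    """One point-to-plane residual per probe contact: the signed distance
    of the reprojected probe tip from the face it touched."""
    errs = []
    for joint_angles, face_idx in touches:
        p = forward_kinematics(joint_angles, offsets)
        n, d = PLANES[face_idx]
        errs.append(np.dot(n, p) - d)
    return np.array(errs)

# touches = [(joint_angles, face_index), ...] recorded at each contact
# result = least_squares(residuals, x0=np.zeros(6), args=(touches,))
```

Note that each contact contributes only a single scalar residual, which is why many touches spread over several faces are needed for the problem to be well constrained.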
Hi @jakub-kaminski and @mikeferguson, I'm looking to perform robot kinematic calibration to determine offsets in the joint positions (XYZ) and orientations (RPY), using ground-truth poses from a laser tracker together with poses from an uncalibrated robot. The goal is to use the optimized XYZ and RPY values to update the joint origins in the URDF file. Does this package include an example for this type of calibration? Alternatively, do you know of any other GitHub packages that might support this functionality? I appreciate your time and any guidance you can provide.
I think we'll need more information on what a "laser tracker" is.
@mikeferguson laser trackers come in different forms. The one we have is actually a 3D scanner from ScanTech. It is essentially a pair of stereo cameras that can track reflective markers in 3D space, giving the 3D location (x, y, z position) of each reflective marker with respect to the camera frame. This serves as the ground-truth measurement for the tool center position (TCP) of the robot arm.
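As a hedged illustration of how such measurements could drive a calibration (this is not an existing robot_calibration feature, and the function names and parameter layout below are hypothetical), one could fit joint-origin corrections by least squares against the tracker's 3D marker positions:

```python
import numpy as np
from scipy.optimize import least_squares

def fk_marker_position(joint_angles, params):
    """Placeholder: predicted 3D position of the reflective marker in the
    tracker frame, given the joint readings and the calibration parameters
    (e.g., per-joint XYZ/RPY origin corrections plus a tracker-to-base
    transform)."""
    raise NotImplementedError  # supplied by your robot model

def residuals(params, samples):
    """Three residuals (x, y, z error) per measured robot pose."""
    errs = []
    for joint_angles, measured_xyz in samples:
        errs.extend(fk_marker_position(joint_angles, params) - measured_xyz)
    return np.array(errs)

# samples = [(joint_angles, measured_xyz), ...] over many diverse poses
# result = least_squares(residuals, x0=np.zeros(n_params), args=(samples,))
# The optimized per-joint corrections are then written back into the URDF
# joint <origin> xyz/rpy values.
```

Since the tracker only measures a 3D point (not full 6-DoF orientation), diverse poses across the workspace are needed to make the orientation offsets observable.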
Hi, I'm interested in calibrating a robot URDF (relative frame translations and rotations) plus joint offsets, and I would like to ask whether this package could work with calibration protocols that use a 6-DoF robot with:

- a touch probe and a datum cube (first video), or
- a touch probe and reference spheres (second video).

If you know of use cases or steps to set this up, please share.
I understand that the optimization task in these videos is to fine-tune the relative URDF joint transformations and offsets so that, in the case of the datum cube, the forward kinematics produces five planes that form an accurate cube, with no outliers.
Or, in the second video, so that the set of sphere-center measurements for each sphere converges to a single center point in the world frame (rather than a scattered collection of points, as when uncalibrated).
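A minimal sketch of that sphere-center objective (assuming a `forward_kinematics` placeholder like the one in the earlier sketch; none of these names come from robot_calibration):

```python
import numpy as np

def sphere_center_residuals(offsets, samples_by_sphere, forward_kinematics):
    """For each physical sphere, reproject every probed center through FK
    and penalize its deviation from the mean of those reprojections: with
    a well-calibrated model, all centers collapse to one world point."""
    errs = []
    for joint_angle_sets in samples_by_sphere:
        centers = np.array([forward_kinematics(q, offsets)
                            for q in joint_angle_sets])
        errs.extend((centers - centers.mean(axis=0)).ravel())
    return np.array(errs)

# samples_by_sphere = [[q1, q2, ...], ...]  # joint readings per sphere
# Minimize with scipy.optimize.least_squares as in the sketches above.
```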
If there are other tools in the ROS ecosystem that serve this purpose better, please don't hesitate to suggest them.
Thank you for your kind support and consideration.
Jakub