
Robot URDF calibration using touch probes and reference geometry. Is it possible? #157

Open
jakub-kaminski opened this issue Jun 2, 2023 · 4 comments


jakub-kaminski commented Jun 2, 2023

Hi, I'm interested in calibrating a robot URDF (relative frame translations and rotations) plus joint offsets. Could this package work with calibration protocols that use a 6-DoF robot with a touch probe and reference geometry, such as a datum cube or calibration spheres?

If you know use cases or steps to set this up, please share.

I understand that the optimization task in these videos is to fine-tune the relative URDF joint transformations and offsets so that, in the datum-cube case, forward kinematics produces five planes that form an accurate cube with no outliers.
Or, in the second video, so that the set of sphere-center measurements for each sphere converges to a single center point in the world frame (rather than the scatter of points you get when uncalibrated).
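The second objective (sphere-center measurements collapsing to one point) can be posed as a nonlinear least-squares problem over joint offsets. Below is a toy sketch with a made-up planar 3R arm standing in for a real URDF chain; it is not robot_calibration's API, just the shape of the math:

```python
import numpy as np
from scipy.optimize import least_squares

L = np.array([0.5, 0.4, 0.3])  # link lengths of a toy planar 3R arm (assumed)

def fk(q, offset):
    """Probe-tip position of the planar 3R arm, with per-joint angle offsets."""
    angles = np.cumsum(q + offset)
    return np.array([L @ np.cos(angles), L @ np.sin(angles)])

true_offset = np.array([0.02, -0.015, 0.01])  # the unknown miscalibration (rad)
target = np.array([0.6, 0.3])                 # one physical point (sphere center)
rng = np.random.default_rng(0)

# Simulate data collection: solve IK on the TRUE arm from random seeds so that
# several different joint configurations all touch the same physical point.
configs = []
for _ in range(100):
    if len(configs) == 8:
        break
    sol = least_squares(lambda q: fk(q, true_offset) - target,
                        rng.uniform(-1.5, 1.5, size=3))
    if sol.cost < 1e-10:
        configs.append(sol.x)
configs = np.array(configs)

def spread(offset):
    """Residual: deviation of each reprojected tip point from the common mean.
    Zero spread means all measurements of the sphere center coincide."""
    pts = np.array([fk(q, offset) for q in configs])
    return (pts - pts.mean(axis=0)).ravel()

est = least_squares(spread, np.zeros(3)).x  # recovered joint offsets
```

Note that a single sphere leaves a gauge freedom: adding a constant to the first joint's offset rotates every reprojected point about the base together, so they stay coincident. Real protocols use several spheres (or a cube) precisely to pin down such unobservable directions.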

If there are other tools in the ROS ecosystem that serve this purpose better, please don't hesitate to suggest them.

Thank you for your kind support and consideration.
Jakub

@mikeferguson (Owner)

I've not really seen this sort of calibration anywhere in the ROS ecosystem.

In theory, robot_calibration could be expanded to do this sort of task (at least the touch-probe-and-cube version), but it doesn't exist out of the box. We do have error cost functions for things like aligning points to a plane, so you could model the cube as a series of planes and get a series of points in 3D space from the reprojection of the arm and touch probe. Actually moving the arm to a desired pose and stopping based on the probe would be something new, though. (It's also entirely possible that there would be unexpected oddities to resolve when constructing so many point-to-plane error blocks with a single point in each.)
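For what it's worth, the shape of those point-to-plane error blocks is easy to sketch outside the package. This is a hypothetical, reduced example (SciPy in place of the package's Ceres-based cost functions, a 4-DoF cube pose, and synthetic points standing in for FK reprojections):

```python
import numpy as np
from scipy.optimize import least_squares

S = 0.05  # half-size of an assumed 100 mm datum cube (m)

def face_normals(yaw):
    """Outward normals of the +x, +y and +z faces after a yaw rotation."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def residuals(params, pts, face_ids):
    """One point-to-plane error per probed point: n_i . (p - center) - S."""
    yaw, center = params[0], params[1:]
    n = face_normals(yaw)[face_ids]  # (N, 3): the normal of each point's face
    return np.einsum("ij,ij->i", n, pts - center) - S

# Synthesise probed points on a cube with true yaw=0.3, center=(0.4, 0.1, 0.2).
# In a real calibration these points would come from FK plus the touch probe,
# and the kinematic parameters would be optimised alongside the cube pose.
rng = np.random.default_rng(1)
true = np.array([0.3, 0.4, 0.1, 0.2])
face_ids = np.repeat([0, 1, 2], 10)
pts = []
for fid in face_ids:
    n = face_normals(true[0])[fid]
    tang = rng.normal(size=3)
    tang -= tang.dot(n) * n          # random in-plane scatter on the face
    tang *= 0.03 / np.linalg.norm(tang)
    pts.append(true[1:] + S * n + tang)
pts = np.array(pts)

fit = least_squares(residuals, x0=np.zeros(4), args=(pts, face_ids))
```

Stacking one such residual per probed point is exactly the "many point-to-plane blocks, one point each" situation described above; the cube pose (and, in a full setup, the kinematic parameters too) falls out of the least-squares fit.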


radhen commented Jan 17, 2025

Hi @jakub-kaminski and @mikeferguson,

I’m looking to perform robot kinematic calibration to determine offsets in joint positions (XYZ) and orientations (RPY), using ground truth poses from a laser tracker and poses from an uncalibrated robot. The goal is to use the optimized XYZ and RPY values to update the joint locations in the URDF file.

Does this package include an example for this type of calibration? Alternatively, do you know of any other GitHub packages that might support this functionality? I appreciate your time and any guidance you can provide.
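As a sketch of the underlying optimization (a hypothetical toy chain, XYZ corrections only for brevity; RPY corrections would enter the same way as extra parameters; this is not an existing example in any package):

```python
import numpy as np
from scipy.optimize import least_squares

# Nominal link translations of a toy z-y-y serial chain (assumed values; a
# real URDF would supply these as each joint's <origin> xyz, plus rpy).
NOMINAL = np.array([[0.0, 0.0, 0.3],
                    [0.4, 0.0, 0.0],
                    [0.35, 0.0, 0.0]])

def rot(axis, a):
    c, s = np.cos(a), np.sin(a)
    if axis == "z":
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])  # "y"

def fk(q, link_xyz):
    """TCP position: alternate joint rotations (axes z, y, y) and translations."""
    p, R = np.zeros(3), np.eye(3)
    for axis, ang, t in zip("zyy", q, link_xyz):
        R = R @ rot(axis, ang)
        p = p + R @ t
    return p

# Simulated ground truth: the real arm has small, unknown translation errors.
true_xyz = NOMINAL + 1e-3 * np.array([[2.0, -1.0, 1.5],
                                      [-3.0, 1.0, 2.0],
                                      [1.0, 2.0, -1.0]])
rng = np.random.default_rng(2)
qs = rng.uniform(-1.5, 1.5, size=(30, 3))
measured = np.array([fk(q, true_xyz) for q in qs])  # tracker "measurements"

def residuals(flat):
    """Stacked (predicted - measured) TCP positions over all 30 poses."""
    xyz = flat.reshape(3, 3)
    return np.concatenate([fk(q, xyz) - m for q, m in zip(qs, measured)])

fit = least_squares(residuals, NOMINAL.ravel())
```

One caveat worth knowing: not every parameter is individually observable from TCP positions alone (e.g., a translation along a joint's own axis trades off against the previous link), so the solver finds one of many parameter sets that reproduce the measurements rather than the unique physical one. Identifiability analysis, or restricting to a minimal parameter set, is part of doing this for real.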

@mikeferguson (Owner)

I think we'll need more information on what a "laser tracker" is.


radhen commented Jan 20, 2025

@mikeferguson Laser trackers come in different forms. The one we have is actually a 3D scanner from ScanTech. It is essentially a pair of stereo cameras that tracks reflective markers in 3D space, giving the 3D location (x, y, z position) of each reflective marker with respect to the camera frame. This serves as the ground-truth measurement for the Tool Center Point (TCP) of the robot arm.
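One practical note on that setup: before comparing the marker positions to forward kinematics, you need the rigid transform between the scanner's camera frame and the robot base frame. With corresponding point pairs (FK-predicted marker positions vs. the same markers as measured), a standard SVD/Kabsch fit recovers it; the helper below is a hypothetical sketch, not part of any package:

```python
import numpy as np

def rigid_register(P, Q):
    """Best-fit rotation R and translation t such that Q ~= P @ R.T + t (Kabsch).

    P: (N, 3) points in one frame (e.g., FK-predicted marker positions in the
       robot base frame); Q: the same markers measured in the camera frame.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # guard against a reflection
    t = cQ - R @ cP
    return R, t
```

The residual of that fit (after applying R and t) is also a quick sanity check on how consistent the tracker and the nominal URDF are before attempting any joint-level calibration.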
