This is a simulation tool for handeye_calibration_ros2 that you can try out offline, without any hardware.
Follow these instructions to install a virtual camera in ROS2 Gazebo, place an ArUco marker, and use MoveIt or custom joint-/Cartesian-space control code to move the robot and collect samples for the hand-eye calibration process. This setup allows you to analyze hand-eye calibration errors across different robot poses and sample points, as the simulation environment provides known ground truth data.
By replicating the same robot and marker poses in real-world experiments, you may reduce calibration errors.
To get started, please prepare:
- A ROS 2 driver for the robot of your choice.
- Camera description files added to your ROS 2 driver.
- An ArUco marker whose side length matches the configuration file, or modify the configuration to match your marker's length.
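The camera description is typically a `<gazebo>` block attached to a camera link in your robot's URDF/xacro. A minimal sketch for Gazebo Classic with gazebo_ros_pkgs is shown below; the link name, topic namespace, and parameter values are illustrative assumptions, not taken from this repo, so adapt them to your own description files:

```xml
<!-- Illustrative sketch: a simple RGB camera sensor on a link named
     "camera_link", publishing images via the gazebo_ros_camera plugin.
     Names and values are assumptions; match them to your own URDF. -->
<gazebo reference="camera_link">
  <sensor type="camera" name="camera">
    <update_rate>30.0</update_rate>
    <camera>
      <horizontal_fov>1.3962634</horizontal_fov>
      <image>
        <width>640</width>
        <height>480</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.02</near>
        <far>10.0</far>
      </clip>
    </camera>
    <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
      <ros>
        <namespace>camera</namespace>
      </ros>
      <frame_name>camera_link</frame_name>
    </plugin>
  </sensor>
</gazebo>
```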
Here we provide an example using the KUKA LBR robot with the lbr_fri_ros2_stack.
Download the camera_descriptions_example in this repo and replace the description files under lbr-stack/src/lbr_fri_ros2_stack/lbr_description/urdf/iiwa7.
Please refer to the latest lbr-stack instructions here.
```shell
cd lbr-stack
colcon build
source install/setup.bash
ros2 launch lbr_bringup bringup.launch.py moveit:=true sim:=true
```
In Gazebo, you should be able to see the camera with image capture:
In RViz, add an Image display and choose the /camera/image_raw topic; you should then see the camera image:
Open a new terminal and go to your hand-eye calibration workspace:
```shell
cd handeye_calibration_ws
colcon build --packages-select handeye_sim
source install/setup.bash
ros2 launch handeye_sim taking_sample_launch.py
```
You should see a new camera window pop up:
Press `q` to record both the robot and marker poses, and press `e` to exit. You should see the marker image, robot_data_simulation.yaml, and marker_data_simulation.yaml saved under handeye_sim/resource.
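The exact YAML layout depends on the package, but robot and marker poses are commonly stored as a translation plus an (x, y, z, w) quaternion, while the calibration math consumes 4x4 rigid transforms. As a hedged sketch of that conversion when inspecting the saved samples (the function names here are hypothetical, not from handeye_sim):

```python
import math

def quat_to_matrix(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ]

def pose_to_transform(translation, quaternion):
    """Build a 4x4 homogeneous transform from translation + quaternion."""
    r = quat_to_matrix(*quaternion)
    tx, ty, tz = translation
    return [r[0] + [tx], r[1] + [ty], r[2] + [tz], [0.0, 0.0, 0.0, 1.0]]

# Example: a pose rotated 90 degrees about z, 0.1 m above the base frame.
s = math.sin(math.pi / 4)
T = pose_to_transform((0.0, 0.0, 0.1), (0.0, 0.0, s, math.cos(math.pi / 4)))
```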
Once you have taken enough distinctive samples, you can compute the hand-eye calibration and publish the visualization in RViz.
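Each pair of samples constrains the unknown gripper-to-camera transform X through the classic hand-eye relation AX = XB, where A is the relative motion between two gripper poses and B is the corresponding relative motion of the observed marker. The following self-contained sketch (pure Python, with made-up ground-truth transforms standing in for the simulator's) verifies that relation numerically on synthetic data:

```python
import math

def rot_z(theta, t):
    """Rigid transform: rotation about z by theta, translation t = (x, y, z)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, t[0]],
            [s,  c, 0, t[1]],
            [0,  0, 1, t[2]],
            [0,  0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def inv_rigid(m):
    """Inverse of a rigid transform: [R t]^-1 = [R^T, -R^T t]."""
    r = [[m[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(r[i][j] * m[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]], [0, 0, 0, 1]]

# Ground truth that only the simulator knows (values are illustrative):
X = rot_z(0.3, (0.05, 0.0, 0.10))   # gripper -> camera (the unknown)
T = rot_z(0.0, (0.8, 0.1, 0.0))     # base -> marker (fixed in the world)

# Two recorded robot poses (base -> gripper) ...
G1 = rot_z(0.5, (0.4, 0.0, 0.5))
G2 = rot_z(1.1, (0.3, 0.2, 0.6))
# ... and the marker poses the camera would observe (camera -> marker):
C1 = matmul(inv_rigid(X), matmul(inv_rigid(G1), T))
C2 = matmul(inv_rigid(X), matmul(inv_rigid(G2), T))

# Relative motions between the two samples:
A = matmul(inv_rigid(G2), G1)       # gripper motion
B = matmul(C2, inv_rigid(C1))       # observed marker motion

AX = matmul(A, X)
XB = matmul(X, B)
err = max(abs(AX[i][j] - XB[i][j]) for i in range(4) for j in range(4))
print(f"max |AX - XB| = {err:.2e}")
```

Because every sample pair must satisfy this equation, poses that produce distinctive rotations between samples constrain X much better than near-pure translations, which is why varied orientations matter when collecting samples.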
To compute the hand-eye calibration result:
```shell
ros2 run handeye_sim handeye
```
To publish the visualization in RViz:
```shell
ros2 run handeye_sim eye2hand
```