The physical experiments of GKNet consist of four grasping experiments: (a) static grasping, (b) grasping at varied camera viewpoints, (c) dynamic grasping, and (d) bin picking. The design purpose of each experiment is explained in the manuscript; please refer to Section 7 for details.
This readme file documents how to run all physical experiments presented in the manuscript, which mainly consist of Python and ROS code. All experiments in the manuscript were conducted with a Kinect Xbox 360, but since the old Kinect driver can be buggy, code for the Realsense camera is provided here as well.
We support two types of cameras: the Kinect Xbox 360 and the Realsense D435. Personally, I recommend the Realsense, since the Kinect is quite old and its driver is not very stable. To use the Realsense D435, simply follow the installation instructions on the official website. To use the Kinect Xbox 360, note that ROS uses Python 2.7 as its default Python, so to run the physical experiments you will need to create a separate Anaconda environment with Python 2.7. Since PyTorch dropped Python 2.7 support in its newer releases, you may need to install a PyTorch version that both still supports Python 2.7 and fits your CUDA version. If no such version fits, consider installing an additional, older CUDA. For installing multiple CUDA versions, you can refer to this tutorial.
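The environment setup described above can be sketched as follows. The environment name and package versions here are examples, not requirements; pick the PyTorch build that matches your installed CUDA version.

```shell
# Create a Python 2.7 environment for the ROS-side scripts
# (environment name "gknet_py27" is an example -- choose your own).
conda create -n gknet_py27 python=2.7
conda activate gknet_py27

# PyTorch 1.4.0 was the last release with Python 2.7 support;
# install the wheel that matches your CUDA version.
pip install torch==1.4.0 torchvision==0.5.0
```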
This script runs GKNet to provide grasp detections via the Realsense/Kinect camera. The grasp detection results will be published on a ROS topic for the ROS-side scripts to subscribe to.
python scripts/static_grasp_kt.py dbmctdet_cornell --exp_id static_grasp --arch dla_34 --dataset cornell --fix_res --load_model ../models/model_dla34_cornell.pth --ae_threshold 0.6 --ori_threshold 0.24 --center_threshold 0.10 --scores_threshold 0.15 --center_weight 1.0
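For reference, here is a minimal sketch of how a ROS-side consumer might turn published keypoint pairs into grasp poses and apply the score threshold. The detection tuple layout `(x1, y1, x2, y2, score)` and the helper names are illustrative assumptions, not the actual message format:

```python
import math

def keypoints_to_grasp(x1, y1, x2, y2):
    """Convert one pair of detected grasp keypoints into a
    (center_x, center_y, angle_rad, width) grasp pose."""
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    angle = math.atan2(y2 - y1, x2 - x1)   # gripper orientation in image plane
    width = math.hypot(x2 - x1, y2 - y1)   # gripper opening width in pixels
    return cx, cy, angle, width

def filter_grasps(detections, scores_threshold=0.15):
    """Keep detections above the confidence threshold, best first.
    Each detection is a tuple (x1, y1, x2, y2, score)."""
    kept = [d for d in detections if d[4] >= scores_threshold]
    kept.sort(key=lambda d: d[4], reverse=True)
    return [keypoints_to_grasp(*d[:4]) for d in kept]
```

For example, `filter_grasps([(0, 0, 10, 0, 0.9), (0, 0, 0, 10, 0.05)])` keeps only the first detection and returns its pose `(5.0, 0.0, 0.0, 10.0)`.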
This script runs GKNet to provide grasp detections via the Realsense/Kinect camera. The grasp detection results will be published on a ROS topic for the ROS-side scripts to subscribe to.
python scripts/static_grasp_kt.py dbmctdet_cornell --exp_id grasp_varied_angle --arch dla_34 --dataset cornell --fix_res --load_model ../models/model_dla34_cornell.pth --ae_threshold 0.6 --ori_threshold 0.24 --center_threshold 0.10 --scores_threshold 0.15 --center_weight 1.0
This script runs GKNet to provide continuous grasp detection via the Realsense/Kinect camera. The grasp detection results will be published on a ROS topic for the ROS-side scripts to subscribe to.
python scripts/dynamic_grasp_kt.py dbmctdet_cornell --exp_id dynamic_grasp --arch dla_34 --dataset cornell --fix_res --load_model ../models/model_dla34_cornell.pth --ae_threshold 0.6 --ori_threshold 0.24 --center_threshold 0.10 --scores_threshold 0.15 --center_weight 1.0
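Because dynamic grasping consumes a continuous detection stream, the ROS-side code benefits from temporal smoothing before commanding the arm. A minimal sketch, where the class name and the use of a simple exponential moving average are illustrative assumptions, not the method used in the experiment code:

```python
class GraspSmoother:
    """Exponential moving average over successive grasp centers to
    damp frame-to-frame detection jitter while the target moves."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha   # weight of the newest observation
        self.state = None    # smoothed (x, y), None until first update

    def update(self, x, y):
        if self.state is None:
            self.state = (x, y)
        else:
            a = self.alpha
            px, py = self.state
            self.state = (a * x + (1 - a) * px, a * y + (1 - a) * py)
        return self.state
```

With `alpha` close to 1 the smoother tracks fast motion; lower values trade responsiveness for stability.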
This script runs GKNet to provide grasp detections via the Realsense camera. The grasp detection results will be published on a ROS topic for the ROS-side scripts to subscribe to. Additionally, this code checks whether the pick bin is clean to determine when it is time to end the task; the result is also published through a ROS topic.
python scripts/bin_picking_kt.py dbmctdet --exp_id bin_picking --arch dlanonlocal_34 --dataset jac_coco_36 --load_model ../models/model_dla34_ajd.pth --ae_threshold 0.65 --ori_threshold 0.1745 --center_threshold 0.15 --scores_threshold 0.15
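The bin-clean check can be sketched as a depth comparison against a baseline image of the empty bin: if only a handful of pixels deviate from the baseline, the bin is treated as empty. This is an illustrative assumption about how such a check might work, not the actual implementation; the tolerance and pixel count are hypothetical:

```python
def bin_is_clean(depth, empty_depth, tol=0.01, min_pixels=50):
    """Compare the current depth image (nested lists, meters) against a
    captured empty-bin baseline. The bin is considered clean when fewer
    than `min_pixels` pixels deviate by more than `tol` meters."""
    changed = sum(
        1
        for row, base_row in zip(depth, empty_depth)
        for d, b in zip(row, base_row)
        if abs(d - b) > tol
    )
    return changed < min_pixels
```

In practice the baseline would be captured once at startup, and the check would run on a cropped region of the registered depth image covering the pick bin.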
- Install ROS.
- Install MoveIt!.
- Install camera driver for Kinect or Realsense.
- Download ivaHandy and compile it under your ROS workspace as the experiment codebase.
- Download the handy_experiment package and compile it under your ROS workspace as the experiment codebase.
- Run all launch files for setup step by step.
cd handy_ws
roslaunch finalarm_control controller_manager.launch
roslaunch finalarm_control start_controller.launch
roslaunch finalarm_description robot_state_pub.launch
roslaunch finalarm_moveit_config move_group.launch
roslaunch finalarm_moveit_config moveit_rviz.launch
You might encounter a permission error on the motor serial port after trying to launch the controller for each motor. To fix the error, type
sudo chmod 666 /dev/ttyUSB0
- Run camera driver
roslaunch openni_launch openni.launch depth_registration:=true
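With `depth_registration:=true`, each color pixel has an aligned depth value, so a detected grasp center `(u, v)` can be back-projected into a 3D camera-frame point with the pinhole model. A minimal sketch; the intrinsics below are placeholders, and in practice you would read the real values from the camera's `camera_info` topic:

```python
def deproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a registered pixel (u, v) with depth in meters into
    a 3D point in the camera frame using the pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

For example, a pixel at the principal point maps straight down the optical axis: `deproject(320, 240, 1.0, 500.0, 500.0, 320.0, 240.0)` gives `(0.0, 0.0, 1.0)`.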
- Run the corresponding script for each experiment.
roslaunch handy_experiment static_grasp.launch
roslaunch handy_experiment dynamic_grasp.launch
roslaunch handy_experiment bin_picking.launch
Note:
- The static grasping and grasping at varied camera angles experiments share the same source code.
- Running the dynamic grasping experiment requires running dbrt to estimate the gripper's pose. All related code is stored here. Please follow its instructions to set everything up before launching dynamic_grasp.launch.
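Once the gripper's pose is tracked, a camera-frame grasp point must be expressed in the robot frame before motion planning. This amounts to applying a 4x4 homogeneous transform; a minimal sketch, where the transform's source (e.g. an extrinsic calibration giving robot_T_camera) and all names are assumptions for illustration:

```python
def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T (row-major nested lists),
    e.g. robot_T_camera, to a 3D point p = (x, y, z)."""
    x, y, z = p
    # Rotation part times the point, plus the translation column.
    return [T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3] for i in range(3)]
```

With a pure translation `T`, the point simply shifts: identity rotation plus translation `(1, 2, 3)` maps the origin to `[1, 2, 3]`.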