
WS22 - Gesture Recognition

The project aims to identify gestures shown by humans and, for a pickup gesture, trace an optimal path to grab the indicated object.


Table of Contents:

  1. Contributions
  2. Introduction
  3. Architecture
  4. Organisation
  5. Dependencies
  6. Things to consider
  7. Getting started
  8. Acknowledgements

Contributions:

Gesture Recognition

  • Faster and more accurate gesture recognition
  • Introduced invariance to background and lighting conditions
  • Improved accuracy of the pointing feature
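The pointing feature can be sketched as follows: given two hand landmarks (for example the index-finger base and tip, as MediaPipe's hand tracker provides), cast a 2D ray from the hand and pick the detected bounding box whose centre lies closest to that ray. This is an illustrative sketch only; `select_pointed_box` and its inputs are hypothetical and may differ from the repo's actual selection logic.

```python
import math

def select_pointed_box(mcp, tip, boxes):
    """Pick the box whose centre lies closest to the ray mcp -> tip.

    mcp, tip: (x, y) image coordinates of the index-finger base and tip
    boxes:    list of (x_min, y_min, x_max, y_max) bounding boxes
    (Hypothetical helper; the package's real selection code may differ.)
    """
    dx, dy = tip[0] - mcp[0], tip[1] - mcp[1]
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    best, best_dist = None, float("inf")
    for box in boxes:
        cx = (box[0] + box[2]) / 2.0
        cy = (box[1] + box[3]) / 2.0
        # project the centre onto the ray; skip boxes behind the hand
        t = (cx - mcp[0]) * dx + (cy - mcp[1]) * dy
        if t < 0:
            continue
        # perpendicular distance from the centre to the ray
        dist = abs((cx - mcp[0]) * dy - (cy - mcp[1]) * dx)
        if dist < best_dist:
            best, best_dist = box, dist
    return best
```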

3D Object Detection

  • Migration from PCL to Open3D
  • Improved detection by enclosing object in a 3D bounding box
  • Removed noise using outlier removal
  • Used clustering to segment out the object
  • Introduced 3D visualization
  • Added Rviz markers
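The noise-removal step can be illustrated with a small NumPy re-implementation of the idea behind Open3D's `remove_statistical_outlier` (compare each point's mean nearest-neighbour distance against the global distribution). This is a sketch for intuition only, not the code used in the package; on a real cloud you would call the Open3D method directly.

```python
import numpy as np

def remove_statistical_outliers(points, nb_neighbors=8, std_ratio=2.0):
    """Drop points whose mean distance to their nearest neighbours is
    unusually large -- the same idea as Open3D's
    remove_statistical_outlier (illustrative re-implementation).

    points: (N, 3) array; returns the filtered (M, 3) array.
    """
    # all pairwise distances (fine for small clouds; use a KD-tree at scale)
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dists, np.inf)
    # mean distance to the nb_neighbors closest points
    nearest = np.sort(dists, axis=1)[:, :nb_neighbors]
    mean_d = nearest.mean(axis=1)
    # keep points within std_ratio standard deviations of the global mean
    keep = mean_d <= mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]
```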

Optimal Grab Pose Estimator

  • Ensures a minimum distance from obstacles
  • Ensures that the arm is not over-extended
  • Produces a fixed number of poses around the object for an optimal grab action
  • Checks for obstacles in the arm's path before manipulation
  • Picks up the object from the closest valid location
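The pose-generation idea above can be sketched as follows: place a fixed number of candidate base poses on a circle around the object, each facing it, then pick the valid pose closest to the robot. The helper names and the 0.6 m default radius are assumptions for illustration; only the count of ten poses comes from the project description.

```python
import math

def candidate_base_poses(obj_x, obj_y, radius=0.6, n_poses=10):
    """Generate n_poses base poses on a circle around the object,
    each oriented to face it. Returns a list of (x, y, yaw) tuples.
    (Sketch only; the radius and helper names are assumptions.)
    """
    poses = []
    for k in range(n_poses):
        theta = 2.0 * math.pi * k / n_poses
        x = obj_x + radius * math.cos(theta)
        y = obj_y + radius * math.sin(theta)
        yaw = math.atan2(obj_y - y, obj_x - x)  # face the object
        poses.append((x, y, yaw))
    return poses

def closest_valid_pose(poses, robot_xy, is_valid):
    """Pick the valid pose nearest to the robot's current position."""
    valid = [p for p in poses if is_valid(p)]
    if not valid:
        return None
    return min(valid, key=lambda p: math.hypot(p[0] - robot_xy[0],
                                               p[1] - robot_xy[1]))
```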

Localisation

  • Fixed a localisation issue by publishing the map to the localiser

Introduction:

The robot identifies the following gestures shown by a human and performs the corresponding action.

  • Scan - Perform object detection and generate bounding boxes for COCO-dataset objects in the robot's current vision scene
  • Point - Select the object the human is pointing at
  • Stop - Erase the bounding boxes generated in the robot's vision scene

Once the launch file is triggered, the human shows the scan gesture and the robot generates bounding boxes for the objects in the scene. The human then points at one of the objects. The pose of the selected object is converted from 2D to 3D. Ten candidate target poses for the robot base are generated, and point-cloud path validation is performed for each of them with respect to the object pose to discard invalid ones. Navigation then starts towards one of the valid target base poses; if the robot cannot plan a path within 10 seconds, the next valid pose is attempted. Once the robot reaches the target base pose, manipulation is performed to grab the object.
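The navigation fallback described above (move on to the next valid pose when planning exceeds the timeout) can be sketched as a simple retry loop. `plan_path` is a hypothetical stand-in for the real planner interface, not the repo's actual API:

```python
def navigate_with_fallback(valid_poses, plan_path, timeout=10.0):
    """Try valid base poses in order; if planning a path to one fails
    (or exceeds `timeout` seconds inside the planner), try the next.

    plan_path(pose, timeout) is a stand-in for the real planner call
    and returns a path or None. Sketch of the retry logic only.
    """
    for pose in valid_poses:
        path = plan_path(pose, timeout)
        if path is not None:
            return pose, path  # first pose the planner could reach
    return None, None          # no valid pose was reachable
```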


Architecture

[Architecture diagram: SDP flow]

Organisation:

The gesture_detect_classify metapackage contains the following packages:

  • test_pipe_ros - launch file and optimal path planner
  • GestDetClass - gesture recognition inference package
  • detector_yolov7 - YOLOv7 object detection package
  • t2d2t3d - gets the 3D pose of the selected object; obstacle detection
  • Load_stream - navigation, manipulation, validation

Dependencies:

  • mdr_knowledge_base
  • mdr_object_recognition
  • mdr_cloud_object_detection
  • mdr_perception_msgs
  • mdr_manipulation_msgs
  • mdr_manipulation
  • mdr_perception
  • mdr_navigation

Things to consider:

  1. You may need to restart the following services on the robot:

    • hsr_move_arm_action.service
    • hsr_move_arm_joints_action.service
    • hsr_move_base_action.service
    • hsr_move_base_client.service
    • hsr_move_forward_action.service
    • hsr_perceive_plane_action.service
    • hsr_perceive_plane_client.service
    • hsr_pickup_action.service
    • hsr_pickup_client.service
  2. Make sure the robot is accurately localised before running the pipeline
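If actions hang or fail, the services above can be restarted in one loop. This sketch assumes they are systemd units on the robot (the `.service` suffix suggests so); the restart command itself is left commented out so the loop is safe to dry-run:

```shell
# Restart the HSR action services (assumes systemd units on the robot;
# service names taken from the list above).
services="hsr_move_arm_action hsr_move_arm_joints_action hsr_move_base_action \
hsr_move_base_client hsr_move_forward_action hsr_perceive_plane_action \
hsr_perceive_plane_client hsr_pickup_action hsr_pickup_client"
for s in $services; do
  echo "restarting $s.service"
  # sudo systemctl restart "$s.service"   # uncomment when running on the robot
done
```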

Getting Started:

  1. pip3 install open3d
  2. pip3 install tensorflow
  3. pip3 install mediapipe
  4. mkdir ~/catkin_ws/src
  5. cd ~/catkin_ws/src
  6. git clone https://github.com/b-it-bots/mas_domestic_robotics.git
  7. git clone https://github.com/Melvin-Paul-Jacob/Base_Action_Planning.git
  8. cd ~/catkin_ws/
  9. catkin build
  10. cd ~/catkin_ws/
  11. python3 Base_Action_Planning/Gesture_Planner/test_pipe_ros.py

Acknowledgements:

  • Thanks to all b-it-bots mas_domestic_robotics contributors
