Cost Map
Context: URC rules indicate that a significant obstacle will obstruct the water bottle during the auton mission:
> The second object will be a standard 1 L wide-mouthed plastic water bottle of unspecified color/markings (approximately 21.5 cm tall by 9 cm diameter). The second object may have obstacles in the way that require autonomous avoidance.
> The last post may have obstacles in the way that require autonomous avoidance, such as being in a boulder field.
Problem: When navigating during the auton mission, the rover may encounter rocky terrain that could cause it to break or get stuck. As a result, it is perception's job to detect where these obstacles are and relay that information to the navigation system.
Solution: Perception will construct a local costmap using the surface normals of the ZED's point cloud. The costmap will be centered around a requested waypoint and will contain a grid of cells indicating how difficult the corresponding terrain is to traverse.
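One plausible way to turn a surface normal into a cell cost (a minimal sketch, not the committed implementation): a normal pointing straight up means flat, cheap ground, and cost grows with the tilt away from vertical. The saturation angle below is an illustrative assumption.

```cpp
#include <Eigen/Core>
#include <algorithm>
#include <cmath>
#include <cstdint>

// Map a surface normal to an OccupancyGrid cost in [0, 100].
// A normal aligned with world +Z means flat ground -> cost 0; the cost
// grows with the tilt angle and saturates at maxAngleRad (assumed value).
std::int8_t normalToCost(Eigen::Vector3f const& normal, float maxAngleRad = 0.6f) {
    // Angle between the (normalized) normal and the world +Z axis.
    float cosTilt = std::abs(normal.normalized().z());
    float tilt = std::acos(std::clamp(cosTilt, 0.0f, 1.0f));
    float cost = 100.0f * std::min(tilt / maxAngleRad, 1.0f);
    return static_cast<std::int8_t>(cost);
}
```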
Interface (subject to change)
Node: costmap
Subscribes: sensor_msgs/PointCloud2 (ZED point cloud)
Publishes: nav_msgs/OccupancyGrid (Global costmap)
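A minimal node skeleton matching this interface might look like the following. This is a sketch assuming ROS 2 (rclcpp); the topic names and QoS settings are placeholders, not the actual launch configuration.

```cpp
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/point_cloud2.hpp>
#include <nav_msgs/msg/occupancy_grid.hpp>

class CostmapNode : public rclcpp::Node {
public:
    CostmapNode() : Node("costmap") {
        // Incoming ZED point cloud; topic name is an assumption.
        mCloudSub = create_subscription<sensor_msgs::msg::PointCloud2>(
            "zed/point_cloud", rclcpp::SensorDataQoS(),
            [this](sensor_msgs::msg::PointCloud2::ConstSharedPtr msg) { pointCloudCallback(msg); });
        // Outgoing global costmap; topic name is an assumption.
        mCostmapPub = create_publisher<nav_msgs::msg::OccupancyGrid>("costmap", 1);
    }

private:
    void pointCloudCallback(sensor_msgs::msg::PointCloud2::ConstSharedPtr const& msg) {
        nav_msgs::msg::OccupancyGrid grid;
        grid.header.stamp = msg->header.stamp;
        grid.header.frame_id = "map"; // the costmap is expressed in the global frame
        // ... filter points, bin them into cells, compute costs (see the steps below) ...
        mCostmapPub->publish(grid);
    }

    rclcpp::Subscription<sensor_msgs::msg::PointCloud2>::SharedPtr mCloudSub;
    rclcpp::Publisher<nav_msgs::msg::OccupancyGrid>::SharedPtr mCostmapPub;
};

int main(int argc, char** argv) {
    rclcpp::init(argc, argv);
    rclcpp::spin(std::make_shared<CostmapNode>());
    rclcpp::shutdown();
    return 0;
}
```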
Completed Steps:
- Create a subscriber for the point cloud topic
- Filter the point cloud to remove extraneous points
- Bin each point into its corresponding grid cell (see the binning sketch after this list)
- Use the binned points to compute cost*
- Fill in the global costmap using the local costmap
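A sketch of the binning step (grid geometry values here are illustrative; the real origin and resolution come from the requested waypoint and configuration): each point's x/y position in the map frame is converted to a row-major cell index into the OccupancyGrid data array.

```cpp
#include <Eigen/Core>
#include <cstddef>
#include <optional>

struct GridInfo {
    float originX, originY; // world coordinates of cell (0, 0), map frame
    float resolution;       // meters per cell
    int width, height;      // grid size in cells
};

// Convert a point (already in the map frame) to a row-major cell index,
// or std::nullopt if it falls outside the grid.
std::optional<std::size_t> binPoint(Eigen::Vector3f const& p, GridInfo const& g) {
    int col = static_cast<int>((p.x() - g.originX) / g.resolution);
    int row = static_cast<int>((p.y() - g.originY) / g.resolution);
    if (col < 0 || col >= g.width || row < 0 || row >= g.height) return std::nullopt;
    return static_cast<std::size_t>(row) * g.width + col;
}
```

Points that land in the same cell are what the cost computation consumes, whether that cost comes from a fitted surface normal or from the cell's height statistics.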
Future Steps:
- Create a height-based costmap in sim (see the height-spread sketch after this list)
- Test the height-based costmap on the ZED
- Create a topological costmap in sim
- Test the topological costmap on the ZED
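For the planned height-based costmap, one simple formulation (a sketch under an assumed step threshold, not a decided design) scores each cell by the height spread of the points binned into it: a large min-to-max spread suggests a rock or ledge.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Cost of one cell from the z values of the points binned into it.
// Cells whose height spread exceeds maxStep (meters, an assumed threshold)
// are treated as fully untraversable (cost 100); empty cells are unknown (-1).
std::int8_t heightCost(std::vector<float> const& cellHeights, float maxStep = 0.2f) {
    if (cellHeights.empty()) return -1; // OccupancyGrid convention for unknown
    auto [minIt, maxIt] = std::minmax_element(cellHeights.begin(), cellHeights.end());
    float spread = *maxIt - *minIt;
    return static_cast<std::int8_t>(100.0f * std::min(spread / maxStep, 1.0f));
}
```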
The main goal of these steps is to create an accurate costmap. Previous strategies have led to hallucinated objects, so this project may require many iterations over different costmap types, along with IRL testing.
Points in the point cloud are initially expressed relative to the ZED's camera frame. However, the occupancy grid is expressed in the global frame with reference to a given waypoint. To transform points from the ZED camera frame into the map frame, use the SE3Conversions::fromTfTree(TfBuffer, "<camera_frame>", "<map frame>"); function.
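The sketch below shows the shape of this step. SE3Conversions::fromTfTree is part of this codebase and its return type is not documented here, so the sketch instead uses plain tf2 (lookupTransform plus tf2::doTransform), which performs the same TF-tree lookup; it is an illustrative equivalent assuming ROS 2, not the project's own helper.

```cpp
#include <geometry_msgs/msg/point_stamped.hpp>
#include <geometry_msgs/msg/transform_stamped.hpp>
#include <tf2_geometry_msgs/tf2_geometry_msgs.hpp>
#include <tf2_ros/buffer.h>

// Transform one point from the ZED camera frame into the map frame.
// Frame names are placeholders; use the frames actually published to TF.
geometry_msgs::msg::PointStamped cameraToMap(tf2_ros::Buffer& tfBuffer,
                                             geometry_msgs::msg::PointStamped const& pointInCamera) {
    // Same intent as: SE3Conversions::fromTfTree(tfBuffer, "<camera_frame>", "<map frame>")
    geometry_msgs::msg::TransformStamped cameraToMapTf =
        tfBuffer.lookupTransform("map", pointInCamera.header.frame_id, tf2::TimePointZero);
    geometry_msgs::msg::PointStamped pointInMap;
    tf2::doTransform(pointInCamera, pointInMap, cameraToMapTf);
    return pointInMap;
}
```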
Furthermore, applying an Exponentially Weighted Moving Average (EWMA) filter to the costmap has been found to help reduce noise in the final output.
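For reference, an EWMA simply blends each new per-cell cost with the previous filtered value; alpha below is an assumed tuning parameter, not a chosen constant.

```cpp
#include <cstddef>
#include <vector>

// Exponentially Weighted Moving Average over the costmap: each new cost
// observation is blended with the running filtered value. Higher alpha
// tracks changes faster; lower alpha suppresses more noise.
void ewmaUpdate(std::vector<float>& filtered, std::vector<float> const& newCosts, float alpha = 0.3f) {
    for (std::size_t i = 0; i < filtered.size() && i < newCosts.size(); ++i) {
        filtered[i] = alpha * newCosts[i] + (1.0f - alpha) * filtered[i];
    }
}
```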
Ask an auton lead if you have any questions about this.