feat(design/autoware-architecture/perception): add radar perception document (autowarefoundation#521)

* update document

* apply pre-commit

* fix from supported-function to reference-implementation

* Update docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/radar-based-3d-detector.md

* Update docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/faraway-object-detection.md

---------

Signed-off-by: scepter914 <[email protected]>
Co-authored-by: Shunsuke Miura <[email protected]>
scepter914 and miursh authored Feb 21, 2024
1 parent 95cf4b2 commit 8ea5138
Showing 5 changed files with 751 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/design/autoware-architecture/perception/index.md
@@ -103,6 +103,7 @@ As mentioned in the goal session, this perception module is designed to be exten
| Camera DNN based 2D detector | This module takes camera images as input and detects objects such as vehicles, trucks, buses, pedestrians, and bicycles in the two-dimensional image space. It detects objects in image coordinates; providing 3D coordinate information is not mandatory. | - Camera Images |
| LiDAR Clustering | This module performs clustering of point clouds and shape estimation to achieve object detection without labels. | - Point Clouds |
| Semi-rule based detector | This module detects objects using information from both images and point clouds, and it consists of two components: LiDAR Clustering and Camera DNN based 2D detector. | - Output from Camera DNN based 2D detector and LiDAR Clustering |
| Radar based 3D detector | This module takes radar data as input and detects dynamic 3D objects. For details, please see [this document](reference-implementations/radar-based-3d-detector/radar-based-3d-detector.md). | - Radar data |
| Object Merger | This module integrates results from various detectors. | - Detected Objects |
| Interpolator | This module stabilizes the object detection results by maintaining long-term detection results using Tracking results. | - Detected Objects <br> - Tracked Objects |
| Tracking | This module assigns IDs to the detection results and estimates their velocity. | - Detected Objects |
64 changes: 64 additions & 0 deletions docs/design/autoware-architecture/perception/reference-implementations/radar-based-3d-detector/faraway-object-detection.md
@@ -0,0 +1,64 @@
# Faraway dynamic object detection with radar objects

## Overview

The following diagram describes the pipeline for faraway dynamic object detection with radar objects.

![faraway object detection](image/faraway-object-detection.drawio.svg)
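
As a rough sketch of how these stages compose, the pipeline can be viewed as a chain of filters over a list of radar objects. The `RadarObject` structure and stage names below are simplified placeholders for illustration, not the actual Autoware message types or node interfaces.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class RadarObject:
    """Simplified radar object: position [m] and velocity [m/s] in the ego frame."""

    x: float
    y: float
    vx: float
    vy: float


# A pipeline stage takes a list of radar objects and returns a (usually smaller) list.
Stage = Callable[[List[RadarObject]], List[RadarObject]]


def run_pipeline(objects: List[RadarObject], stages: List[Stage]) -> List[RadarObject]:
    """Apply each filter stage in order, mirroring the chain shown in the diagram."""
    for stage in stages:
        objects = stage(objects)
    return objects
```

The sketches in the following sections reuse this `RadarObject` type.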

## Reference implementation

### Crossing filter

- [radar_crossing_objects_noise_filter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_crossing_objects_noise_filter)

This package filters out noise objects that move across the path of the ego vehicle; such crossing objects are most likely ghost objects.
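
A minimal sketch of the underlying idea, not the package's actual implementation or parameters: an object whose velocity is nearly perpendicular to its line of sight from the ego vehicle is treated as a crossing (ghost) object and dropped. The function name and thresholds are illustrative, and `RadarObject` is the type from the overview sketch.

```python
import math
from typing import List


def filter_crossing_objects(objects: List["RadarObject"],
                            max_angle_to_los_deg: float = 60.0,
                            min_speed: float = 1.0) -> List["RadarObject"]:
    """Keep objects moving roughly along their line of sight; drop crossing ones."""
    kept = []
    for obj in objects:
        if math.hypot(obj.vx, obj.vy) < min_speed:
            kept.append(obj)  # too slow to judge the motion direction reliably
            continue
        los = math.atan2(obj.y, obj.x)        # direction from the ego vehicle to the object
        heading = math.atan2(obj.vy, obj.vx)  # direction of the object's velocity
        # Angle between velocity and line of sight, folded into [0, 90] degrees.
        diff = abs((heading - los + math.pi) % (2.0 * math.pi) - math.pi)
        angle_to_los_deg = math.degrees(min(diff, math.pi - diff))
        if angle_to_los_deg < max_angle_to_los_deg:
            kept.append(obj)  # mostly radial motion -> keep
        # else: nearly perpendicular to the line of sight -> treated as crossing noise
    return kept
```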

### Velocity filter

- [object_velocity_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_velocity_splitter)

Static radar objects include a lot of noise, such as reflections from the ground.
For radars, dynamic objects can in many cases be detected stably.
To filter out static objects, `object_velocity_splitter` can be used.
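
Conceptually, the split is a simple speed threshold. Below is a sketch under the same assumptions as the overview; the real `object_velocity_splitter` node has its own parameters, which this does not reproduce.

```python
import math
from typing import List, Tuple


def split_by_velocity(objects: List["RadarObject"],
                      speed_threshold: float = 3.0) -> Tuple[List["RadarObject"], List["RadarObject"]]:
    """Split objects into (dynamic, static) by absolute speed [m/s]; the threshold is illustrative."""
    dynamic, static = [], []
    for obj in objects:
        (dynamic if math.hypot(obj.vx, obj.vy) >= speed_threshold else static).append(obj)
    return dynamic, static
```

In this pipeline, only the dynamic objects are passed to the following stages.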

### Range filter

- [object_range_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_range_splitter)

With some radars, ghost objects sometimes appear around nearby objects.
To filter these objects, `object_range_splitter` can be used.
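
A sketch of the same idea for the range split; the function name and threshold are illustrative, not the actual node parameters.

```python
import math
from typing import List, Tuple


def split_by_range(objects: List["RadarObject"],
                   range_threshold: float = 30.0) -> Tuple[List["RadarObject"], List["RadarObject"]]:
    """Split objects into (faraway, near) by distance [m] from the ego vehicle."""
    faraway, near = [], []
    for obj in objects:
        (faraway if math.hypot(obj.x, obj.y) >= range_threshold else near).append(obj)
    return faraway, near
```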

### Vector map filter

- [object-lanelet-filter](https://github.com/autowarefoundation/autoware.universe/blob/main/perception/detected_object_validation/object-lanelet-filter.md)

In most cases, vehicles drive within the drivable area.
To filter out objects that are outside the drivable area, `object-lanelet-filter` can be used.
`object-lanelet-filter` removes objects that are outside the drivable area defined by the vector map.

Note that if you use `object-lanelet-filter` for radar faraway detection, the drivable area in the vector map needs to be defined beyond the lanes where the autonomous vehicle itself runs; otherwise faraway objects outside that area are filtered out.
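
A conceptual sketch of this filtering step, assuming the drivable area is available as simple 2D polygons; the real `object-lanelet-filter` queries lanelet2 vector-map primitives, which are not modeled here.

```python
from typing import List, Sequence, Tuple

Polygon = Sequence[Tuple[float, float]]  # drivable-area polygon as (x, y) vertices


def point_in_polygon(x: float, y: float, polygon: Polygon) -> bool:
    """Even-odd ray-casting test for a point against a 2D polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def filter_by_drivable_area(objects: List["RadarObject"],
                            drivable_polygons: List[Polygon]) -> List["RadarObject"]:
    """Keep only objects whose center lies inside at least one drivable-area polygon."""
    return [obj for obj in objects
            if any(point_in_polygon(obj.x, obj.y, poly) for poly in drivable_polygons)]
```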

### Radar object clustering

- [radar_object_clustering](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_object_clustering)

This package can combine multiple radar detections that come from a single object into one detection and adjust its class and size.
This suppresses object splitting in the tracking module.

![radar_object_clustering](https://raw.githubusercontent.com/autowarefoundation/autoware.universe/main/perception/radar_object_clustering/docs/radar_clustering.drawio.svg)
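
A simplified sketch of the clustering idea: greedily group detections whose centers are close, then merge each group into a single object at the centroid. The real node also adjusts label and size according to its parameters, which this sketch omits; the distance threshold is illustrative and `RadarObject` is the type from the overview sketch.

```python
import math
from dataclasses import replace
from typing import List


def cluster_radar_objects(objects: List["RadarObject"],
                          distance_threshold: float = 5.0) -> List["RadarObject"]:
    """Merge detections that lie within `distance_threshold` [m] of an existing cluster."""
    clusters: List[List["RadarObject"]] = []
    for obj in objects:
        for cluster in clusters:
            if any(math.hypot(obj.x - o.x, obj.y - o.y) < distance_threshold for o in cluster):
                cluster.append(obj)
                break
        else:  # no nearby cluster found -> start a new one
            clusters.append([obj])
    merged = []
    for cluster in clusters:
        n = len(cluster)
        merged.append(replace(cluster[0],
                              x=sum(o.x for o in cluster) / n,
                              y=sum(o.y for o in cluster) / n,
                              vx=sum(o.vx for o in cluster) / n,
                              vy=sum(o.vy for o in cluster) / n))
    return merged
```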

## Note

### Parameter tuning

Detection performed only with radar applies various strong noise-filtering processes.
This creates a trade-off: if the noise filtering is too strong, objects that should be detected disappear; if it is too weak, noise objects keep appearing in front of the ego vehicle and the self-driving system cannot start moving.
It is necessary to tune the parameters with this trade-off in mind.

### Limitation

- Elevated railways and vehicles on multi-level intersections

If you use 2D radars (radars that detect in the xy-plane but have no z-axis measurement) and the driving area includes an elevated railway or vehicles on a multi-level intersection, the radar pipeline detects these objects, which has a bad influence on the planning results.
In addition, for now, an elevated railway is detected as a vehicle because the radar pipeline has no label classification feature, which leads to unintended behavior.