Commit
Deployed bed42ca to latest with MkDocs 1.4.3 and mike 2.2.0.dev0
github-actions committed Nov 20, 2024
1 parent 1c52a73 commit 46fb166
Showing 14 changed files with 25 additions and 25 deletions.

@@ -7869,7 +7869,7 @@

## JSON Schema
<span class="p">}</span>
</code></pre></div>
The schema file path is `INSERT_PATH_TO_PACKAGE/schema/` and the schema file name is `INSERT_NODE_NAME.schema.json`. To adapt the template to the ROS node, replace each `INSERT_...` placeholder and add all parameters `1..N`.
- See example: *Lidar Apollo Segmentation TVM Nodes* [schema](https://github.com/autowarefoundation/autoware.universe/blob/main/perception/lidar_apollo_segmentation_tvm_nodes/schema/lidar_apollo_segmentation_tvm_nodes.schema.json)
+ See example: *Image Projection Based Fusion - Pointpainting* [schema](https://github.com/autowarefoundation/autoware.universe/blob/main/universe/perception/autoware_image_projection_based_fusion/schema/pointpainting.schema.json)
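
As an illustration, here is a minimal filled-in schema for a hypothetical node, checked with the `jsonschema` Python package. The package name `my_radar_filter` and its parameter are invented for this sketch; the layout follows the template above.

```python
# Minimal sketch of a filled-in parameter schema; "my_radar_filter" and its
# parameter are hypothetical, not a real Autoware package.
import jsonschema  # pip install jsonschema

schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Parameters for my_radar_filter",
    "type": "object",
    "definitions": {
        "my_radar_filter": {
            "type": "object",
            "properties": {
                "velocity_threshold": {
                    "type": "number",
                    "description": "Minimum speed for an object to be kept [m/s].",
                    "default": 3.0,
                    "minimum": 0.0,
                },
            },
            "required": ["velocity_threshold"],
            "additionalProperties": False,
        }
    },
}

# Validate a parameter set against the node's definition; raises on violation.
jsonschema.validate({"velocity_threshold": 5.0}, schema["definitions"]["my_radar_filter"])
```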
<h3 id="attributes">Attributes<a class="headerlink" href="#attributes" title="Permanent link">#</a></h3>
Parameters have several attributes, some required and some optional. The optional attributes are highly encouraged when applicable, as they provide useful information about a parameter and can ensure its value stays within bounds.
<h4 id="required">Required<a class="headerlink" href="#required" title="Permanent link">#</a></h4>

@@ -7775,37 +7775,37 @@

## Overview
<h2 id="reference-implementation">Reference implementation<a class="headerlink" href="#reference-implementation" title="Permanent link">#</a></h2>
<h3 id="crossing-filter">Crossing filter<a class="headerlink" href="#crossing-filter" title="Permanent link">#</a></h3>
- - [radar_crossing_objects_noise_filter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_crossing_objects_noise_filter)
+ - [radar_crossing_objects_noise_filter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_radar_crossing_objects_noise_filter)
This package can filter out noise objects crossing in front of the ego vehicle, which are most likely ghost objects.
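
As a rough illustration of the idea (not the package's actual algorithm), a crossing filter can flag objects whose velocity direction is nearly perpendicular to their line of sight from the ego vehicle:

```python
import math

def is_crossing_noise(x, y, vx, vy, angle_threshold_deg=30.0):
    """Sketch: flag a detection (ego frame) whose velocity direction is within
    angle_threshold_deg of being perpendicular to its line of sight."""
    line_of_sight = math.atan2(y, x)      # bearing from ego to the object
    heading = math.atan2(vy, vx)          # direction the object is moving
    # Normalize the angle difference to [0, pi].
    diff = abs((heading - line_of_sight + math.pi) % (2 * math.pi) - math.pi)
    return abs(diff - math.pi / 2) < math.radians(angle_threshold_deg)
```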
<h3 id="velocity-filter">Velocity filter<a class="headerlink" href="#velocity-filter" title="Permanent link">#</a></h3>
- - [object_velocity_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_velocity_splitter)
+ - [object_velocity_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_object_velocity_splitter)
Static objects include a lot of noise, such as reflections from the ground.
Dynamic objects, by contrast, can in many cases be detected stably by radar.
To filter out static objects, `object_velocity_splitter` can be used.
A schematic sketch of this splitting pattern is shown after the range filter section below.
<h3 id="range-filter">Range filter<a class="headerlink" href="#range-filter" title="Permanent link">#</a></h3>
- - [object_range_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_range_splitter)
+ - [object_range_splitter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_object_range_splitter)
For some radars, ghost objects sometimes occur around nearby objects.
To filter these objects, `object_range_splitter` can be used.
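
Both splitters boil down to the same pattern: partitioning a detection list by a scalar threshold. A schematic sketch under that reading (the `Obj` type and threshold values are stand-ins, not the packages' real interfaces):

```python
import math
from collections import namedtuple

Obj = namedtuple("Obj", "x y vx vy")  # simplified stand-in for a detected object

def split_by(objects, key, threshold):
    """Partition detections into (below, at_or_above) lists by a scalar key."""
    below = [o for o in objects if key(o) < threshold]
    at_or_above = [o for o in objects if key(o) >= threshold]
    return below, at_or_above

objects = [Obj(10.0, 2.0, 0.1, 0.0), Obj(40.0, -1.0, 8.0, 0.5)]
# Velocity split: separate (mostly noisy) static objects from dynamic ones.
static_objs, dynamic_objs = split_by(objects, lambda o: math.hypot(o.vx, o.vy), 3.0)
# Range split: handle nearby objects, where ghosts occur more often, separately.
near_objs, far_objs = split_by(objects, lambda o: math.hypot(o.x, o.y), 30.0)
```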
<h3 id="vector-map-filter">Vector map filter<a class="headerlink" href="#vector-map-filter" title="Permanent link">#</a></h3>
- - [object-lanelet-filter](https://github.com/autowarefoundation/autoware.universe/blob/main/perception/detected_object_validation/object-lanelet-filter.md)
+ - [object-lanelet-filter](https://github.com/autowarefoundation/autoware.universe/blob/main/perception/autoware_detected_object_validation/object-lanelet-filter.md)
In most cases, vehicles drive in the drivable area.
To filter out objects that are outside the drivable area, `object-lanelet-filter` can be used; it removes objects that fall outside the drivable area defined by the vector map.

Note that if you use `object-lanelet-filter` for radar faraway detection, you need to define the drivable area in the vector map beyond the area where the autonomous car itself runs.
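
The core of such a filter is a point-in-polygon test against the drivable area. A simplified sketch using `shapely` (the real node works on lanelet2 vector maps and object polygons, not a hard-coded rectangle):

```python
from shapely.geometry import Point, Polygon  # pip install shapely

# Hypothetical drivable area; the real filter reads lanelet polygons from the vector map.
drivable_area = Polygon([(0.0, -10.0), (100.0, -10.0), (100.0, 10.0), (0.0, 10.0)])

def keep_objects_in_drivable_area(objects, area=drivable_area):
    """Drop detections whose center point falls outside the drivable area."""
    return [o for o in objects if area.contains(Point(o.x, o.y))]
```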
<h3 id="radar-object-clustering">Radar object clustering<a class="headerlink" href="#radar-object-clustering" title="Permanent link">#</a></h3>
- - [radar_object_clustering](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_object_clustering)
+ - [radar_object_clustering](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_radar_object_clustering)
This package can combine multiple radar detections of a single object into one and adjust their class and size.
It can suppress object splitting in the tracking module.
- ![radar_object_clustering](https://raw.githubusercontent.com/autowarefoundation/autoware.universe/main/perception/radar_object_clustering/docs/radar_clustering.drawio.svg)
+ ![radar_object_clustering](https://raw.githubusercontent.com/autowarefoundation/autoware.universe/main/perception/autoware_radar_object_clustering/docs/radar_clustering.drawio.svg)
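
Schematically, the clustering step can be pictured as greedy distance-based grouping followed by keeping one representative detection per group; this sketch illustrates that idea, not the package's actual implementation:

```python
import math

def cluster_detections(objects, max_gap=2.0):
    """Greedily group detections closer than max_gap [m] to any group member,
    then keep one representative (here: the detection nearest to the ego)."""
    clusters = []
    for o in objects:
        for c in clusters:
            if any(math.hypot(o.x - m.x, o.y - m.y) < max_gap for m in c):
                c.append(o)
                break
        else:
            clusters.append([o])
    # The real node would also adjust class and size when merging a cluster.
    return [min(c, key=lambda m: math.hypot(m.x, m.y)) for c in clusters]
```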
<h2 id="note">Note<a class="headerlink" href="#note" title="Permanent link">#</a></h2>
<h3 id="parameter-tuning">Parameter tuning<a class="headerlink" href="#parameter-tuning" title="Permanent link">#</a></h3>
Detection performed only by radar applies various strong noise processing.

@@ -7863,7 +7863,7 @@

### Noise filter and radar faraway dynamic 3D object detection
For details, please see [this document](../faraway-object-detection/).
<h3 id="radar-fusion-to-lidar-based-3d-object-detection">Radar fusion to LiDAR-based 3D object detection<a class="headerlink" href="#radar-fusion-to-lidar-based-3d-object-detection" title="Permanent link">#</a></h3>
- - [radar_fusion_to_detected_object](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_fusion_to_detected_object)
+ - [radar_fusion_to_detected_object](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_radar_fusion_to_detected_object)
This package contains a sensor fusion module for radar-detected objects and 3D detected objects. The fusion node can:

@@ -7845,19 +7845,19 @@

### Noise filter
Radar can detect x-axis (radial) velocity as Doppler velocity, but cannot detect y-axis (lateral) velocity. Some radars estimate y-axis velocity inside the device, but it often lacks precision. This package treats such objects as noise by applying a y-axis velocity threshold filter.
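
In essence, the check is a single threshold on the lateral velocity component; a minimal sketch (the threshold value is illustrative):

```python
def filter_lateral_velocity_noise(objects, vy_max=7.0):
    """Keep detections whose lateral (y-axis) velocity is plausible; radar
    Doppler measures the radial (x-axis) component far more reliably."""
    return [o for o in objects if abs(o.vy) <= vy_max]
```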
<h3 id="message-converter">Message converter<a class="headerlink" href="#message-converter" title="Permanent link">#</a></h3>
- - [radar_tracks_msgs_converter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_tracks_msgs_converter)
+ - [radar_tracks_msgs_converter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_radar_tracks_msgs_converter)
This package converts `radar_msgs/msg/RadarTracks` into `autoware_auto_perception_msgs/msg/DetectedObject`, applying ego vehicle motion compensation and a coordinate transform.
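
The essential step of the motion compensation is adding the ego twist back onto the radar's relative velocity; a simplified 2D sketch that ignores the sensor-to-base_link transform and covariances:

```python
def compensate_ego_motion(obj_x, obj_y, rel_vx, rel_vy, ego_vx, ego_yaw_rate):
    """Sketch: convert radar-relative velocity to ground-fixed velocity (2D).
    Ego rotation adds a tangential component (omega x r) at the object's position."""
    ground_vx = rel_vx + ego_vx - ego_yaw_rate * obj_y
    ground_vy = rel_vy + ego_yaw_rate * obj_x
    return ground_vx, ground_vy
```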
<h3 id="object-merger">Object merger<a class="headerlink" href="#object-merger" title="Permanent link">#</a></h3>
- - [object_merger](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_merger)
+ - [object_merger](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_object_merger)
This package can merge two topics of `autoware_auto_perception_msgs/msg/DetectedObject`.
- - [simple_object_merger](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/simple_object_merger)
+ - [simple_object_merger](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_simple_object_merger)
This package can simply merge multiple topics of `autoware_auto_perception_msgs/msg/DetectedObject`.
- Different from [object_merger](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/object_merger), this package doesn't use an association algorithm and can merge with low computational cost.
+ Different from [object_merger](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_object_merger), this package doesn't use an association algorithm and can merge with low computational cost.
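
Without association, merging reduces to concatenating the object arrays of the latest message from each input topic. A schematic sketch (message fields follow the `DetectedObjects` layout; the timestamp and frame checks of the real node are omitted):

```python
def merge_simple(messages):
    """Concatenate the `objects` arrays of several DetectedObjects messages
    without any association between them."""
    merged = type(messages[0])()        # new, empty message of the same type
    merged.header = messages[0].header  # the real node also checks stamps/frames
    for msg in messages:
        merged.objects.extend(msg.objects)
    return merged
```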
- [topic_tools](https://github.com/ros-tooling/topic_tools)

@@ -7884,7 +7884,7 @@

### Message converter from RadarScan to PointCloud2
For the considered use cases:
- Use [pointcloud_preprocessor](https://github.com/autowarefoundation/autoware.universe/tree/main/sensing/pointcloud_preprocessor) for radar scan.
- - Apply obstacle segmentation like [ground segmentation](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/ground_segmentation) to radar points for LiDAR-less (camera + radar) systems.
+ - Apply obstacle segmentation like [ground segmentation](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_ground_segmentation) to radar points for LiDAR-less (camera + radar) systems.
<h2 id="appendix">Appendix<a class="headerlink" href="#appendix" title="Permanent link">#</a></h2>
<h3 id="discussion">Discussion<a class="headerlink" href="#discussion" title="Permanent link">#</a></h3>

@@ -7819,7 +7819,7 @@

### Message usage for RadarTracks
uint16 BICYCLE = 32006;
uint16 PEDESTRIAN = 32007;
```
- For the detailed implementation, please see [radar_tracks_msgs_converter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/radar_tracks_msgs_converter).
+ For the detailed implementation, please see [radar_tracks_msgs_converter](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_radar_tracks_msgs_converter).
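
The converter's class handling is essentially a lookup from the `RadarTracks` classification IDs above to Autoware object labels. A sketch of that table (only `BICYCLE` and `PEDESTRIAN` are visible in the snippet above; the remaining entries are assumed to follow the same numbering convention):

```python
# Sketch of the ID-to-label lookup; entries other than 32006/32007 are
# assumptions extrapolated from the snippet above.
RADAR_TRACK_CLASS_TO_LABEL = {
    32000: "UNKNOWN",
    32001: "CAR",
    32002: "TRUCK",
    32003: "BUS",
    32004: "TRAILER",
    32005: "MOTORCYCLE",
    32006: "BICYCLE",
    32007: "PEDESTRIAN",
}

def to_label(radar_class_id: int) -> str:
    return RADAR_TRACK_CLASS_TO_LABEL.get(radar_class_id, "UNKNOWN")
```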
<h2 id="note">Note<a class="headerlink" href="#note" title="Permanent link">#</a></h2>
<h3 id="survey-for-radar-message">Survey for radar message<a class="headerlink" href="#survey-for-radar-message" title="Permanent link">#</a></h3>
Depending on the sensor manufacturer and its purpose, each sensor might output raw or post-processed data. This section introduces a survey of previously developed messaging systems in the open-source community. Although there are many kinds of outputs, radars mainly adopt two output types: point clouds and objects. Related discussions about message definitions in ros-perception are [PR #1](https://github.com/ros-perception/radar_msgs/pull/1), [PR #2](https://github.com/ros-perception/radar_msgs/pull/2), and [PR #3](https://github.com/ros-perception/radar_msgs/pull/3). Existing open-source software for radar is summarized in these PRs.

@@ -8350,7 +8350,7 @@

### Camera Launching
for example, we will use `/perception/object_detection` as the tensorrt_yolo node namespace,
as will be explained in the autoware usage section.
For more information,
- please check the [image_projection_based_fusion](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/image_projection_based_fusion) package.
+ please check the [image_projection_based_fusion](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_image_projection_based_fusion) package.
After preparing `camera_node_container.launch.py` in our forked `common_sensor_launch` package,
we need to build the package:

@@ -7780,7 +7780,7 @@

### Perception
If you want to use traffic light recognition and visualization,
you can set `traffic_light_recognition/enable_fine_detection` to true (default).
Please check the
- [traffic_light_fine_detector](https://autowarefoundation.github.io/autoware.universe/main/perception/traffic_light_fine_detector/)
+ [traffic_light_fine_detector](https://autowarefoundation.github.io/autoware.universe/main/perception/autoware_traffic_light_fine_detector/)
page for more information.
If you don't want to use the traffic light classifier, you can disable it:
<div class="highlight"><pre><span></span><code><span class="gd">- &lt;arg name="traffic_light_recognition/enable_fine_detection" default="true" description="enable traffic light fine detection"/&gt;</span>

@@ -7627,7 +7627,7 @@

## tier4_perception_component.launch.xml
Here are some predefined perception launch arguments:
- - **`occupancy_grid_map_method`:** This argument determines the occupancy grid map method for the perception stack. Please check the [probabilistic_occupancy_grid_map](https://autowarefoundation.github.io/autoware.universe/main/perception/probabilistic_occupancy_grid_map/) package for detailed information.
+ - **`occupancy_grid_map_method`:** This argument determines the occupancy grid map method for the perception stack. Please check the [probabilistic_occupancy_grid_map](https://autowarefoundation.github.io/autoware.universe/main/perception/autoware_probabilistic_occupancy_grid_map/) package for detailed information.
  The default probabilistic occupancy grid map method is `pointcloud_based_occupancy_grid_map`.
  If you want to change it to `laserscan_based_occupancy_grid_map`, you can change it here:
<div class="highlight"><pre><span></span><code><span class="gd">- &lt;arg name="occupancy_grid_map_method" default="pointcloud_based_occupancy_grid_map" description="options: pointcloud_based_occupancy_grid_map, laserscan_based_occupancy_grid_map"/&gt;</span>

@@ -7638,7 +7638,7 @@

## tier4_perception_component.launch.xml
- **`detected_objects_filter_method`:** This argument determines the filter method for detected objects.
- Please check the [detected_object_validation](https://autowarefoundation.github.io/autoware.universe/main/perception/detected_object_validation/) package for detailed information about the lanelet and position filters.
+ Please check the [detected_object_validation](https://autowarefoundation.github.io/autoware.universe/main/perception/autoware_detected_object_validation/) package for detailed information about the lanelet and position filters.
  The default detected object filter method is `lanelet_filter`.
  If you want to change it to `position_filter`, you can change it here:
<div class="highlight"><pre><span></span><code><span class="gd">- &lt;arg name="detected_objects_filter_method" default="lanelet_filter" description="options: lanelet_filter, position_filter"/&gt;</span>

@@ -7649,7 +7649,7 @@

## tier4_perception_component.launch.xml
- **`detected_objects_validation_method`:** This argument determines the validation method for detected objects.
- Please check the [detected_object_validation](https://autowarefoundation.github.io/autoware.universe/main/perception/detected_object_validation/) package for detailed information about validation methods.
+ Please check the [detected_object_validation](https://autowarefoundation.github.io/autoware.universe/main/perception/autoware_detected_object_validation/) package for detailed information about validation methods.
  The default detected object validation method is `obstacle_pointcloud`.
  If you want to change it to `occupancy_grid`, you can change it here,
  but remember it requires the `laserscan_based_occupancy_grid_map` method as `occupancy_grid_map_method`:

@@ -7687,7 +7687,7 @@

## perception.launch.xml
- **`remove_unknown`:** This parameter determines whether to remove unknown objects at camera-lidar fusion.
- Please check the [roi_cluster_fusion](https://github.com/autowarefoundation/autoware.universe/blob/main/perception/image_projection_based_fusion/docs/roi-cluster-fusion.md) node for detailed information.
+ Please check the [roi_cluster_fusion](https://github.com/autowarefoundation/autoware.universe/blob/main/perception/autoware_image_projection_based_fusion/docs/roi-cluster-fusion.md) node for detailed information.
  The default value is `true`.
  If you want to change it to `false`,
  you can add this argument to `tier4_perception_component.launch.xml`,

@@ -7644,7 +7644,7 @@

## Running 2D/3D object detection without CUDA
- `lidar-centerpoint` + `tensorrt_yolo`
- `euclidean_cluster`
- Of these five configurations, only the last one (`euclidean_cluster`) can be run without CUDA. For more details, refer to the [`euclidean_cluster` module's README file](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/euclidean_cluster).
+ Of these five configurations, only the last one (`euclidean_cluster`) can be run without CUDA. For more details, refer to the [`euclidean_cluster` module's README file](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/autoware_euclidean_cluster).
<h2 id="running-traffic-light-detection-without-cuda">Running traffic light detection without CUDA<a class="headerlink" href="#running-traffic-light-detection-without-cuda" title="Permanent link">#</a></h2>
For traffic light recognition (both detection and classification), there are two modules that require CUDA: