feat(pointcloud_preprocessor): support 3d distortion corrector for distortion corrector node #7031

Closed
3 changes: 3 additions & 0 deletions sensing/pointcloud_preprocessor/CMakeLists.txt
@@ -9,6 +9,7 @@ set(CGAL_DATA_DIR ".")

find_package(OpenCV REQUIRED)
find_package(Eigen3 REQUIRED)
find_package(Sophus REQUIRED)
find_package(Boost REQUIRED)
find_package(PCL REQUIRED)
find_package(CGAL REQUIRED COMPONENTS Core)
@@ -19,6 +20,7 @@ include_directories(
${Boost_INCLUDE_DIRS}
${PCL_INCLUDE_DIRS}
${EIGEN3_INCLUDE_DIRS}
${Sophus_INCLUDE_DIRS}
${OpenCV_INCLUDE_DIRS}
)

@@ -86,6 +88,7 @@ target_link_libraries(pointcloud_preprocessor_filter
faster_voxel_grid_downsample_filter
${Boost_LIBRARIES}
${OpenCV_LIBRARIES}
${Sophus_LIBRARIES}
${PCL_LIBRARIES}
)

41 changes: 21 additions & 20 deletions sensing/pointcloud_preprocessor/docs/distortion-corrector.md
@@ -2,46 +2,47 @@

## Purpose

The `distortion_corrector` is a node that compensates pointcloud distortion caused by ego vehicle's movement during 1 scan.
The `distortion_corrector` is a node that compensates for pointcloud distortion caused by the ego-vehicle's movement during 1 scan.

Since the LiDAR sensor scans by rotating an internal laser, the resulting point cloud will be distorted if the ego-vehicle moves during a single scan (as shown by the figure below). The node corrects this by interpolating sensor data using odometry of ego-vehicle.
Since the LiDAR sensor scans by rotating an internal laser, the resulting point cloud will be distorted if the ego-vehicle moves during a single scan (as shown by the figure below). The node corrects this by interpolating sensor data using the odometry of the ego-vehicle.

## Inner-workings / Algorithms

- Use the equations below (specific to the Velodyne 32C sensor) to obtain an accurate timestamp for each scan data point.
- Use twist information to determine the distance the ego-vehicle has traveled between the time that the scan started and the corrected timestamp of each point, and then correct the position of the point.
The node utilizes twist information (linear velocity and angular velocity) from the `~/input/twist` topic. If the user sets `use_imu` to true, the node will replace the twist's angular velocity with the angular velocity from the IMU. Afterward, the node undistorts the points one by one based on this velocity information.

The offset equation is given by
$ TimeOffset = (55.296 \mu s \times SequenceIndex) + (2.304 \mu s \times DataPointIndex) $
The node also supports two different kinds of distortion methods: 2D distortion correction and 3D distortion correction. The main difference is that the 2D distortion corrector only utilizes the x-axis of linear velocity and the z-axis of angular velocity to correct the points. On the other hand, the 3D distortion corrector utilizes all linear and angular velocity components to correct the points.

To calculate the exact point time, add the TimeOffset to the timestamp.
$ ExactPointTime = TimeStamp + TimeOffset $
Please note that the processing time difference between the two distortion methods is significant; the 3D corrector takes roughly 50% more time than the 2D corrector. Therefore, in general cases, it is recommended to set `use_3d_distortion_correction` to `false`. However, in scenarios such as a vehicle going over speed bumps, the 3D corrector can be beneficial.

![distortion corrector figure](./image/distortion_corrector.jpg)

## Inputs / Outputs

### Input

| Name | Type | Description |
| ---------------- | ------------------------------------------------ | ---------------- |
| `~/input/points` | `sensor_msgs::msg::PointCloud2` | reference points |
| `~/input/twist` | `geometry_msgs::msg::TwistWithCovarianceStamped` | twist |
| `~/input/imu` | `sensor_msgs::msg::Imu` | imu data |
| Name | Type | Description |
| ---------------- | ------------------------------------------------ | ---------------------------------- |
| `~/input/points` | `sensor_msgs::msg::PointCloud2` | Topic of the distorted pointcloud. |
| `~/input/twist` | `geometry_msgs::msg::TwistWithCovarianceStamped` | Topic of the twist information. |
| `~/input/imu` | `sensor_msgs::msg::Imu` | Topic of the IMU data. |

### Output

| Name | Type | Description |
| ----------------- | ------------------------------- | --------------- |
| `~/output/points` | `sensor_msgs::msg::PointCloud2` | filtered points |
| Name | Type | Description |
| ----------------- | ------------------------------- | ----------------------------------- |
| `~/output/points` | `sensor_msgs::msg::PointCloud2` | Topic of the undistorted pointcloud. |

## Parameters

### Core Parameters

| Name | Type | Default Value | Description |
| ---------------------- | ------ | ------------- | ----------------------------------------------------------- |
| `timestamp_field_name` | string | "time_stamp" | time stamp field name |
| `use_imu` | bool | true | use gyroscope for yaw rate if true, else use vehicle status |
| Name | Type | Default Value | Description |
| ------------------------------ | -------- | ------------- | ----------------------------------------------------------- |
| `time_stamp_field_name`        | `string` | `time_stamp`  | Name of the time stamp field.                                |
| `use_imu`                      | `bool`   | `true`        | Use gyroscope for yaw rate if true, else use vehicle status. |
| `use_3d_distortion_correction` | `bool`   | `false`       | Use 3D correction if true, otherwise use 2D correction.      |
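For illustration, these parameters could be provided via a parameter file like the following (the values shown are the defaults declared in the implementation; the file layout itself is an example, not part of this PR):

```yaml
/**:
  ros__parameters:
    time_stamp_field_name: "time_stamp"
    use_imu: true
    use_3d_distortion_correction: false
```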

## Assumptions / Known limits

- The node requires that time synchronization works well between the pointcloud, twist, and IMU.
- If you want to use the 3D distortion corrector without an IMU, please check that the linear and angular velocity fields of your twist message are not empty.
sensing/pointcloud_preprocessor/include/pointcloud_preprocessor/distortion_corrector/distortion_corrector.hpp
@@ -15,7 +15,9 @@
#ifndef POINTCLOUD_PREPROCESSOR__DISTORTION_CORRECTOR__DISTORTION_CORRECTOR_HPP_
#define POINTCLOUD_PREPROCESSOR__DISTORTION_CORRECTOR__DISTORTION_CORRECTOR_HPP_

#include <Eigen/Core>
#include <rclcpp/rclcpp.hpp>
#include <sophus/se3.hpp>

#include <geometry_msgs/msg/twist_stamped.hpp>
#include <geometry_msgs/msg/twist_with_covariance_stamped.hpp>
@@ -58,11 +60,17 @@ class DistortionCorrectorComponent : public rclcpp::Node
void onTwistWithCovarianceStamped(
const geometry_msgs::msg::TwistWithCovarianceStamped::ConstSharedPtr twist_msg);
void onImu(const sensor_msgs::msg::Imu::ConstSharedPtr imu_msg);

bool getTransform(
const std::string & target_frame, const std::string & source_frame,
tf2::Transform * tf2_transform_ptr);
bool getTransform(
const std::string & target_frame, const std::string & source_frame,
tf2::Transform * tf2_transform_ptr, Eigen::Matrix4f & eigen_transform);

bool undistortPointCloud(const tf2::Transform & tf2_base_link_to_sensor, PointCloud2 & points);
bool undistortPointCloud(
const tf2::Transform & tf2_base_link_to_sensor, Eigen::Matrix4f & eigen_base_link_to_sensor,
PointCloud2 & points);

rclcpp::Subscription<PointCloud2>::SharedPtr input_points_sub_;
rclcpp::Subscription<sensor_msgs::msg::Imu>::SharedPtr imu_sub_;
@@ -81,6 +89,7 @@ class DistortionCorrectorComponent : public rclcpp::Node
std::string base_link_frame_ = "base_link";
std::string time_stamp_field_name_;
bool use_imu_;
bool use_3d_distortion_correction_;
};

} // namespace pointcloud_preprocessor
sensing/pointcloud_preprocessor/src/distortion_corrector/distortion_corrector.cpp
@@ -1,4 +1,4 @@
// Copyright 2020 Tier IV, Inc.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@@ -16,6 +16,8 @@

#include "tier4_autoware_utils/math/trigonometry.hpp"

#include <tf2_eigen/tf2_eigen.hpp>

#include <deque>
#include <string>
#include <utility>
@@ -39,6 +41,7 @@
// Parameter
time_stamp_field_name_ = declare_parameter("time_stamp_field_name", "time_stamp");
use_imu_ = declare_parameter("use_imu", true);
use_3d_distortion_correction_ = declare_parameter("use_3d_distortion_correction", false);

// Publisher
undistorted_points_pub_ =
@@ -123,10 +126,13 @@
return;
}

Eigen::Matrix4f eigen_base_link_to_sensor;
tf2::Transform tf2_base_link_to_sensor{};
getTransform(points_msg->header.frame_id, base_link_frame_, &tf2_base_link_to_sensor);
getTransform(
points_msg->header.frame_id, base_link_frame_, &tf2_base_link_to_sensor,
eigen_base_link_to_sensor);

undistortPointCloud(tf2_base_link_to_sensor, *points_msg);
undistortPointCloud(tf2_base_link_to_sensor, eigen_base_link_to_sensor, *points_msg);

if (debug_publisher_) {
auto pipeline_latency_ms =
@@ -151,6 +157,35 @@
}
}

bool DistortionCorrectorComponent::getTransform(
const std::string & target_frame, const std::string & source_frame,
tf2::Transform * tf2_transform_ptr, Eigen::Matrix4f & eigen_transform)
{
if (target_frame == source_frame) {
tf2_transform_ptr->setOrigin(tf2::Vector3(0.0, 0.0, 0.0));
tf2_transform_ptr->setRotation(tf2::Quaternion(0.0, 0.0, 0.0, 1.0));
eigen_transform = Eigen::Matrix4f::Identity();
return true;
}

try {
const auto transform_msg =
tf2_buffer_.lookupTransform(target_frame, source_frame, tf2::TimePointZero);
tf2::convert(transform_msg.transform, *tf2_transform_ptr);
eigen_transform = tf2::transformToEigen(transform_msg.transform).matrix().cast<float>();
} catch (const tf2::TransformException & ex) {
RCLCPP_WARN(get_logger(), "%s", ex.what());
RCLCPP_ERROR(
get_logger(), "Please publish TF %s to %s", target_frame.c_str(), source_frame.c_str());

tf2_transform_ptr->setOrigin(tf2::Vector3(0.0, 0.0, 0.0));
tf2_transform_ptr->setRotation(tf2::Quaternion(0.0, 0.0, 0.0, 1.0));
eigen_transform = Eigen::Matrix4f::Identity();
return false;
}
return true;
}

bool DistortionCorrectorComponent::getTransform(
const std::string & target_frame, const std::string & source_frame,
tf2::Transform * tf2_transform_ptr)
@@ -178,136 +213,196 @@
}

bool DistortionCorrectorComponent::undistortPointCloud(
const tf2::Transform & tf2_base_link_to_sensor, PointCloud2 & points)
const tf2::Transform & tf2_base_link_to_sensor, Eigen::Matrix4f & eigen_base_link_to_sensor,
PointCloud2 & points)
{
if (points.data.empty() || twist_queue_.empty()) {
RCLCPP_WARN_STREAM_THROTTLE(
get_logger(), *get_clock(), 10000 /* ms */,
"input_pointcloud->points or twist_queue_ is empty.");
return false;
}

auto time_stamp_field_it = std::find_if(
std::cbegin(points.fields), std::cend(points.fields),
[this](const sensor_msgs::msg::PointField & field) {
return field.name == time_stamp_field_name_;
});
if (time_stamp_field_it == points.fields.cend()) {
RCLCPP_WARN_STREAM_THROTTLE(
get_logger(), *get_clock(), 10000 /* ms */,
"Required field time stamp doesn't exist in the point cloud.");
return false;
}

sensor_msgs::PointCloud2Iterator<float> it_x(points, "x");
sensor_msgs::PointCloud2Iterator<float> it_y(points, "y");
sensor_msgs::PointCloud2Iterator<float> it_z(points, "z");
sensor_msgs::PointCloud2ConstIterator<double> it_time_stamp(points, time_stamp_field_name_);

float theta{0.0f};
float x{0.0f};
float y{0.0f};
double prev_time_stamp_sec{*it_time_stamp};
const double first_point_time_stamp_sec{*it_time_stamp};

auto twist_it = std::lower_bound(
std::begin(twist_queue_), std::end(twist_queue_), first_point_time_stamp_sec,
[](const geometry_msgs::msg::TwistStamped & x, const double t) {
return rclcpp::Time(x.header.stamp).seconds() < t;
});
twist_it = twist_it == std::end(twist_queue_) ? std::end(twist_queue_) - 1 : twist_it;

decltype(angular_velocity_queue_)::iterator imu_it;
if (use_imu_ && !angular_velocity_queue_.empty()) {
imu_it = std::lower_bound(
std::begin(angular_velocity_queue_), std::end(angular_velocity_queue_),
first_point_time_stamp_sec, [](const geometry_msgs::msg::Vector3Stamped & x, const double t) {
return rclcpp::Time(x.header.stamp).seconds() < t;
});
imu_it =
imu_it == std::end(angular_velocity_queue_) ? std::end(angular_velocity_queue_) - 1 : imu_it;
}

const tf2::Transform tf2_base_link_to_sensor_inv{tf2_base_link_to_sensor.inverse()};
const Eigen::Matrix4f eigen_base_link_to_sensor_inv = eigen_base_link_to_sensor.inverse();

// For performance, do not instantiate `rclcpp::Time` inside of the for-loop
double twist_stamp = rclcpp::Time(twist_it->header.stamp).seconds();

// For performance, instantiate outside of the for-loop
tf2::Quaternion baselink_quat{};
tf2::Transform baselink_tf_odom{};
tf2::Vector3 point{};
tf2::Vector3 undistorted_point{};

// For performance, avoid transform computation if unnecessary
bool need_transform = points.header.frame_id != base_link_frame_;

// For performance, do not instantiate `rclcpp::Time` inside of the for-loop
double imu_stamp{0.0};
if (use_imu_ && !angular_velocity_queue_.empty()) {
imu_stamp = rclcpp::Time(imu_it->header.stamp).seconds();
}

// If there is a point that cannot be associated, record it to issue a warning
bool twist_time_stamp_is_too_late = false;
bool imu_time_stamp_is_too_late = false;

// For performance, instantiate outside of the for-loop
tf2::Quaternion baselink_quat{};
tf2::Transform baselink_tf_odom{};

// for 2d motion
tf2::Vector3 point{};
tf2::Vector3 undistorted_point{};
float theta{0.0f};
float x{0.0f}, y{0.0f};
float v{0.0f}, w{0.0f};

// for 3d motion
Eigen::Vector4f point_eigen;
Eigen::Vector4f undistorted_point_eigen;
Eigen::Matrix4f transformation_matrix;
Eigen::Matrix4f prev_transformation_matrix = Eigen::Matrix4f::Identity();

float v_x{0.0f}, v_y{0.0f}, v_z{0.0f}, w_x{0.0f}, w_y{0.0f}, w_z{0.0f};

for (; it_x != it_x.end(); ++it_x, ++it_y, ++it_z, ++it_time_stamp) {
while (twist_it != std::end(twist_queue_) - 1 && *it_time_stamp > twist_stamp) {
++twist_it;
twist_stamp = rclcpp::Time(twist_it->header.stamp).seconds();
}

float v{static_cast<float>(twist_it->twist.linear.x)};
float w{static_cast<float>(twist_it->twist.angular.z)};
if (use_3d_distortion_correction_) {
v_x = static_cast<float>(twist_it->twist.linear.x);
v_y = static_cast<float>(twist_it->twist.linear.y);
v_z = static_cast<float>(twist_it->twist.linear.z);
w_x = static_cast<float>(twist_it->twist.angular.x);
w_y = static_cast<float>(twist_it->twist.angular.y);
w_z = static_cast<float>(twist_it->twist.angular.z);
} else {
v = static_cast<float>(twist_it->twist.linear.x);
w = static_cast<float>(twist_it->twist.angular.z);
}

if (std::abs(*it_time_stamp - twist_stamp) > 0.1) {
twist_time_stamp_is_too_late = true;
v = 0.0f;
w = 0.0f;

if (use_3d_distortion_correction_) {
v_x = 0.0f;
v_y = 0.0f;
v_z = 0.0f;
w_x = 0.0f;
w_y = 0.0f;
w_z = 0.0f;
} else {
v = 0.0f;
w = 0.0f;
}
}

if (use_imu_ && !angular_velocity_queue_.empty()) {
while (imu_it != std::end(angular_velocity_queue_) - 1 && *it_time_stamp > imu_stamp) {
++imu_it;
imu_stamp = rclcpp::Time(imu_it->header.stamp).seconds();
}

if (std::abs(*it_time_stamp - imu_stamp) > 0.1) {
imu_time_stamp_is_too_late = true;
} else {
w = static_cast<float>(imu_it->vector.z);
if (use_3d_distortion_correction_) {
w_x = static_cast<float>(imu_it->vector.x);
w_y = static_cast<float>(imu_it->vector.y);
w_z = static_cast<float>(imu_it->vector.z);
} else {
w = static_cast<float>(imu_it->vector.z);
}
}
}

const auto time_offset = static_cast<float>(*it_time_stamp - prev_time_stamp_sec);

point.setValue(*it_x, *it_y, *it_z);
if (use_3d_distortion_correction_) {
point_eigen << *it_x, *it_y, *it_z, 1.0;

if (need_transform) {
point = tf2_base_link_to_sensor_inv * point;
}
if (need_transform) {
point_eigen = eigen_base_link_to_sensor_inv * point_eigen;
}

theta += w * time_offset;
baselink_quat.setValue(
0, 0, tier4_autoware_utils::sin(theta * 0.5f),
tier4_autoware_utils::cos(theta * 0.5f)); // baselink_quat.setRPY(0.0, 0.0, theta);
const float dis = v * time_offset;
x += dis * tier4_autoware_utils::cos(theta);
y += dis * tier4_autoware_utils::sin(theta);
Sophus::SE3f::Tangent twist(v_x, v_y, v_z, w_x, w_y, w_z);
twist = twist * time_offset;
transformation_matrix = Sophus::SE3f::exp(twist).matrix();
transformation_matrix = transformation_matrix * prev_transformation_matrix;
undistorted_point_eigen = transformation_matrix * point_eigen;

baselink_tf_odom.setOrigin(tf2::Vector3(x, y, 0.0));
baselink_tf_odom.setRotation(baselink_quat);
if (need_transform) {
undistorted_point_eigen = eigen_base_link_to_sensor * undistorted_point_eigen;
}
*it_x = undistorted_point_eigen[0];
*it_y = undistorted_point_eigen[1];
*it_z = undistorted_point_eigen[2];

undistorted_point = baselink_tf_odom * point;
prev_transformation_matrix = transformation_matrix;
} else {
point.setValue(*it_x, *it_y, *it_z);

if (need_transform) {
undistorted_point = tf2_base_link_to_sensor * undistorted_point;
}
if (need_transform) {
point = tf2_base_link_to_sensor_inv * point;
}

theta += w * time_offset;
baselink_quat.setValue(
0, 0, tier4_autoware_utils::sin(theta * 0.5f),
tier4_autoware_utils::cos(theta * 0.5f)); // baselink_quat.setRPY(0.0, 0.0, theta);
const float dis = v * time_offset;
x += dis * tier4_autoware_utils::cos(theta);
y += dis * tier4_autoware_utils::sin(theta);

baselink_tf_odom.setOrigin(tf2::Vector3(x, y, 0.0));
baselink_tf_odom.setRotation(baselink_quat);

undistorted_point = baselink_tf_odom * point;

*it_x = static_cast<float>(undistorted_point.getX());
*it_y = static_cast<float>(undistorted_point.getY());
*it_z = static_cast<float>(undistorted_point.getZ());
if (need_transform) {
undistorted_point = tf2_base_link_to_sensor * undistorted_point;
}

*it_x = static_cast<float>(undistorted_point.getX());
*it_y = static_cast<float>(undistorted_point.getY());
*it_z = static_cast<float>(undistorted_point.getZ());
}

prev_time_stamp_sec = *it_time_stamp;
}