From cbde4179503f4bb2bddff5c3f1c3f3ece1b6f38a Mon Sep 17 00:00:00 2001
From: okrusch <102369315+okrusch@users.noreply.github.com>
Date: Sat, 2 Dec 2023 10:31:09 +0100
Subject: [PATCH 1/5] Create LIDAR_data.md

---
 doc/03_research/02_perception/LIDAR_data.md | 59 +++++++++++++++++++++
 1 file changed, 59 insertions(+)
 create mode 100644 doc/03_research/02_perception/LIDAR_data.md

diff --git a/doc/03_research/02_perception/LIDAR_data.md b/doc/03_research/02_perception/LIDAR_data.md
new file mode 100644
index 00000000..a5f5a282
--- /dev/null
+++ b/doc/03_research/02_perception/LIDAR_data.md
@@ -0,0 +1,59 @@
+# LIDAR-Data
+
+This file discusses where the LIDAR data comes from, how it is processed and how we could use it.
+
+## Origin
+
+LIDAR data arrives as point clouds on a specific LIDAR topic.
+
+`rospy.Subscriber(rospy.get_param('~source_topic', "/carla/hero/LIDAR"),
+                  PointCloud2, self.callback)`
+
+## Processing
+
+The goal is to identify objects and their distance. Therefore we need to calculate distances from the point cloud data.
+To do this, the lidar-distance node first converts the point cloud data to an array of Cartesian coordinates.
+
+`paf23-agent-1  | (76.12445   , -1.6572031e+01, 13.737187  , 0.7287409 )`
+
+`paf23-agent-1  | (71.9434    , -1.8718828e+01, 13.107929  , 0.7393809 )`
+
+`paf23-agent-1  | (-0.3482422 , -1.6367188e-02, -0.20128906, 0.99839103)`
+
+`paf23-agent-1  | (-0.3486328 , -1.4062500e-02, -0.20152344, 0.99838954)`
+
+`paf23-agent-1  | (-0.35070312, -2.3828126e-03, -0.2025    , 0.99838144)`
+
+The first three values of each row correspond to x, y, z:
+
+x - the X Cartesian coordinate of a point (float32)
+
+y - the Y Cartesian coordinate of a point (float32)
+
+z - the Z Cartesian coordinate of a point (float32)
+
+It was not specified anywhere what the 4th value represents. My best guess is some sort of intensity.
+
+## Distance Calculation
+
+The distance to a point is calculated as the Euclidean distance from the origin (0, 0, 0) for every point in the point cloud.
+
+`distances = np.array(
+    [np.linalg.norm(c - [0, 0, 0]) for c in coordinates_xyz])`
+
+The node then publishes the minimum and maximum distance.
+
+## Open questions
+
+1. Currently we have no specified unit.
+   **Is a relative distance enough?**
+
+2. How do we translate Cartesian coordinates to pixels in the image and vice versa?

From 4f006efe2493e758c8c4bc3ce73ed7aea2e0716d Mon Sep 17 00:00:00 2001
From: okrusch <102369315+okrusch@users.noreply.github.com>
Date: Sun, 3 Dec 2023 10:58:33 +0100
Subject: [PATCH 2/5] Update LIDAR_data.md

---
 doc/03_research/02_perception/LIDAR_data.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/doc/03_research/02_perception/LIDAR_data.md b/doc/03_research/02_perception/LIDAR_data.md
index a5f5a282..a48b8da6 100644
--- a/doc/03_research/02_perception/LIDAR_data.md
+++ b/doc/03_research/02_perception/LIDAR_data.md
@@ -9,6 +9,8 @@ LIDAR data arrives as point clouds on a specific LIDAR topic.
 `rospy.Subscriber(rospy.get_param('~source_topic', "/carla/hero/LIDAR"),
                   PointCloud2, self.callback)`
 
+Read more about the LIDAR sensor [here](https://github.com/una-auxme/paf23/blob/main/doc/06_perception/03_lidar_distance_utility.md).
+
 ## Processing
 
 The goal is to identify objects and their distance. Therefore we need to calculate distances from the point cloud data.
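As an aside, the conversion-and-distance pipeline described by the patches above can be sketched in plain NumPy, with no ROS dependency. The input rows are the sample values from the node's log; the variable names (`points`, `coordinates_xyz`, `min_distance`, `max_distance`) are illustrative and not taken verbatim from the actual node.

```python
import numpy as np

# Sample rows as logged by the node: (x, y, z, 4th value, presumably intensity)
points = np.array([
    (76.12445,   -1.6572031e+01, 13.737187,   0.7287409),
    (71.9434,    -1.8718828e+01, 13.107929,   0.7393809),
    (-0.3482422, -1.6367188e-02, -0.20128906, 0.99839103),
])

# Keep only the Cartesian coordinates (x, y, z), dropping the 4th channel
coordinates_xyz = points[:, :3]

# Euclidean distance of every point to the origin (0, 0, 0);
# vectorizing along axis 1 replaces the per-point Python loop
distances = np.linalg.norm(coordinates_xyz, axis=1)

# The node publishes only the extremes
min_distance = distances.min()
max_distance = distances.max()
```

Note that because the sensor reports coordinates relative to its own mount point, the minimum distance here (roughly 0.4) is almost certainly a return from the ego vehicle itself, which is why such close points may need filtering.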
From 37e606a1533548f2413aae87df988fa712bf6fc1 Mon Sep 17 00:00:00 2001
From: okrusch <102369315+okrusch@users.noreply.github.com>
Date: Sun, 3 Dec 2023 11:05:05 +0100
Subject: [PATCH 3/5] Update LIDAR_data.md

---
 doc/03_research/02_perception/LIDAR_data.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/doc/03_research/02_perception/LIDAR_data.md b/doc/03_research/02_perception/LIDAR_data.md
index a48b8da6..343d3918 100644
--- a/doc/03_research/02_perception/LIDAR_data.md
+++ b/doc/03_research/02_perception/LIDAR_data.md
@@ -58,4 +58,6 @@ The node then publishes the minimum and maximum distance.
 
 2. How do we translate Cartesian coordinates to pixels in the image and vice versa?
 
+   We should be able to match the bounding box of the LIDAR PointCloud2 to the camera calibration.
+

From dcd0387553b0bf2618b6e872da63c2f4304692f4 Mon Sep 17 00:00:00 2001
From: okrusch <102369315+okrusch@users.noreply.github.com>
Date: Sun, 3 Dec 2023 11:19:01 +0100
Subject: [PATCH 4/5] Update LIDAR_data.md

---
 doc/03_research/02_perception/LIDAR_data.md | 15 ++++-----------
 1 file changed, 4 insertions(+), 11 deletions(-)

diff --git a/doc/03_research/02_perception/LIDAR_data.md b/doc/03_research/02_perception/LIDAR_data.md
index 343d3918..090bf4ab 100644
--- a/doc/03_research/02_perception/LIDAR_data.md
+++ b/doc/03_research/02_perception/LIDAR_data.md
@@ -4,7 +4,7 @@ This file discusses where the LIDAR data comes from, how it is processed and how
 
 ## Origin
 
-LIDAR data arrives as point clouds on a specific LIDAR topic. 
+LIDAR data arrives as point clouds on a specific LIDAR topic.
 
 `rospy.Subscriber(rospy.get_param('~source_topic', "/carla/hero/LIDAR"),
                   PointCloud2, self.callback)`
@@ -18,16 +18,12 @@ To do this, the lidar-distance node first converts the point cloud data to an ar
 
 `paf23-agent-1  | (76.12445   , -1.6572031e+01, 13.737187  , 0.7287409 )`
 
-
 `paf23-agent-1  | (71.9434    , -1.8718828e+01, 13.107929  , 0.7393809 )`
 
-
 `paf23-agent-1  | (-0.3482422 , -1.6367188e-02, -0.20128906, 0.99839103)`
 
-
 `paf23-agent-1  | (-0.3486328 , -1.4062500e-02, -0.20152344, 0.99838954)`
 
-
 `paf23-agent-1  | (-0.35070312, -2.3828126e-03, -0.2025    , 0.99838144)`
 
 The first three values of each row correspond to x, y, z:
@@ -36,11 +32,10 @@ x - the X Cartesian coordinate of a point (float32)
 
 y - the Y Cartesian coordinate of a point (float32)
 
-z - the Z Cartesian coordinate of a point (float32) 
+z - the Z Cartesian coordinate of a point (float32)
 
 It was not specified anywhere what the 4th value represents. My best guess is some sort of intensity.
-
 
 ## Distance Calculation
 
@@ -48,8 +43,7 @@ The distance to a point is calculated as the Euclidean distance from the origin
 `distances = np.array(
     [np.linalg.norm(c - [0, 0, 0]) for c in coordinates_xyz])`
 
-The node then publishes the minimum and maximum distance.
-
+The node then publishes the minimum and maximum distance.
 
 ## Open questions
 
@@ -59,5 +53,4 @@ The node then publishes the minimum and maximum distance.
 
 2. How do we translate Cartesian coordinates to pixels in the image and vice versa?
 
    We should be able to match the bounding box of the LIDAR PointCloud2 to the camera calibration.
-
-
+
From abb054bfeee040c98814856c68a55563f79acd5f Mon Sep 17 00:00:00 2001
From: okrusch <102369315+okrusch@users.noreply.github.com>
Date: Sun, 3 Dec 2023 11:20:27 +0100
Subject: [PATCH 5/5] Update LIDAR_data.md

---
 doc/03_research/02_perception/LIDAR_data.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/doc/03_research/02_perception/LIDAR_data.md b/doc/03_research/02_perception/LIDAR_data.md
index 090bf4ab..528620dc 100644
--- a/doc/03_research/02_perception/LIDAR_data.md
+++ b/doc/03_research/02_perception/LIDAR_data.md
@@ -53,4 +53,3 @@ The node then publishes the minimum and maximum distance.
 
 2. How do we translate Cartesian coordinates to pixels in the image and vice versa?
 
    We should be able to match the bounding box of the LIDAR PointCloud2 to the camera calibration.
-
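Following up on the document's second open question: mapping Cartesian LIDAR coordinates to image pixels is usually done with a pinhole camera model once a point has been transformed into the camera frame. A minimal sketch, assuming the point is already expressed in the camera frame (x right, y down, z forward); the intrinsics below (1280x720 image with a 90° horizontal FOV) are illustrative assumptions, not CARLA's actual camera calibration.

```python
import math

def project_to_pixel(point_cam, fx, fy, cx, cy):
    """Project a 3D point in the camera frame onto the image plane
    using a pinhole model: u = fx * x / z + cx, v = fy * y / z + cy."""
    x, y, z = point_cam
    if z <= 0:
        return None  # point is behind the camera, no valid pixel
    return (fx * x / z + cx, fy * y / z + cy)

# Illustrative intrinsics: for a horizontal FOV of 90 degrees,
# fx = width / (2 * tan(fov / 2)) = 1280 / 2 = 640
fx = fy = 1280 / (2 * math.tan(math.radians(90) / 2))
cx, cy = 640.0, 360.0  # principal point at the image center

# A point 10 m ahead and 1 m to the right of the camera
uv = project_to_pixel((1.0, 0.0, 10.0), fx, fy, cx, cy)
```

The reverse direction (pixel back to a 3D point) is not uniquely invertible without a depth value, which is exactly what the LIDAR can provide, so fusing the two this way seems plausible.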