SampleConsensusModel fails on lidar points #27

Open
pirobot opened this issue Jan 15, 2019 · 12 comments

pirobot commented Jan 15, 2019

No description provided.


pirobot commented Jan 15, 2019

Thanks for making this package available! I am trying it out using Gazebo 7 under ROS Kinetic and Ubuntu 16.04. You can see the simulation setup below:

[image: Gazebo simulation setup]

The circles are detected OK by the simulated camera. See the Hough transform output below:

[image: Hough transform output]

However, the circles are not detected in the lidar pointcloud by the sphere consensus model. Instead, I get the following stream of errors:

[pcl::SampleConsensusModel::getSamples] Can not select 0 unique points out of 0!
[pcl::RandomSampleConsensus::computeModel] No samples could be selected!
[pcl::SampleConsensusModel::getSamples] Can not select 0 unique points out of 0!
[pcl::RandomSampleConsensus::computeModel] No samples could be selected!
etc
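
For reference, a minimal PCL sketch (not code from this package) reproduces exactly this pair of messages when the cloud handed to RANSAC is empty, which suggests that no points survive up to the sphere-fitting stage:

// Hedged sketch: an empty input cloud makes RandomSampleConsensus fail because
// it cannot draw the samples a sphere model needs.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/sample_consensus/ransac.h>
#include <pcl/sample_consensus/sac_model_sphere.h>

int main()
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
  // cloud is left empty here, e.g. because an earlier step culled every point

  pcl::SampleConsensusModelSphere<pcl::PointXYZ>::Ptr
      model(new pcl::SampleConsensusModelSphere<pcl::PointXYZ>(cloud));
  pcl::RandomSampleConsensus<pcl::PointXYZ> ransac(model);
  ransac.setDistanceThreshold(0.01);

  // Prints "Can not select 0 unique points out of 0!" followed by
  // "No samples could be selected!" and returns false.
  ransac.computeModel();
  return 0;
}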

I can visualize the pointcloud being processed by Calibration3DMarker.cpp. Right after the following code:

Velodyne::Velodyne scan(pc);
scan.getRings();
scan.intensityByRangeDiff();

the pointcloud looks like this:

[image: point cloud after getRings() and intensityByRangeDiff()]

So far so good. But after the projection onto the camera plane, the pointcloud looks like the following:

PointCloud<Velodyne::Point> visible_cloud;
scan.project(P, Rect(0, 0, 672, 376), &visible_cloud);

[image: point cloud after projection onto the camera plane]

So now we can no longer see the marker at all.

I understand the projection uses the camera_info topic, which is shown below. These matrices are generated by the Gazebo camera plugin:

header: 
  seq: 3855
  stamp: 
    secs: 14189
    nsecs: 762000000
  frame_id: "camera_rgb_optical_frame"
height: 376
width: 672
distortion_model: "plumb_bob"
D: [0.0, 0.0, 0.0, 0.0, 0.0]
K: [335.998765805352, 0.0, 336.5, 0.0, 335.998765805352, 188.5, 0.0, 0.0, 1.0]
R: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
P: [335.998765805352, 0.0, 336.5, -0.0, 0.0, 335.998765805352, 188.5, 0.0, 0.0, 0.0, 1.0, 0.0]
binning_x: 0
binning_y: 0
roi: 
  x_offset: 0
  y_offset: 0
  height: 0
  width: 0
  do_rectify: False
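
For reference, a hedged sketch (not the package's actual projection code) of how a P matrix like the one above maps a 3D point to pixel coordinates. A point only lands inside the 672x376 image if it lies in front of the camera along the optical z-axis, which is why the frame orientation matters so much:

// Hedged sketch: pinhole projection with the P matrix from the camera_info above.
#include <Eigen/Dense>
#include <iostream>

int main()
{
  Eigen::Matrix<double, 3, 4> P;
  P << 335.998765805352, 0.0,              336.5, 0.0,
       0.0,              335.998765805352, 188.5, 0.0,
       0.0,              0.0,              1.0,   0.0;

  // A point 2 m ahead of the sensor: (0, 0, 2) in the camera optical frame,
  // but (2, 0, 0) in an unrotated LiDAR frame, i.e. z = 0 there, so it would
  // be rejected by the z > 0 check below.
  Eigen::Vector4d pt(0.0, 0.0, 2.0, 1.0);

  Eigen::Vector3d uvw = P * pt;
  if (uvw.z() > 0.0)
  {
    const double u = uvw.x() / uvw.z();
    const double v = uvw.y() / uvw.z();
    const bool inside = (u >= 0 && u < 672 && v >= 0 && v < 376);
    std::cout << "pixel (" << u << ", " << v << "), inside image: " << inside << "\n";
  }
  else
  {
    std::cout << "point lies behind (or beside) the image plane - culled\n";
  }
  return 0;
}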

Can you think of anything that might be going wrong?

martin-velas (Contributor) commented

Hi! Hmmm - my code assumes that both the LiDAR and the camera point in the same direction. The projection shows that in your setup they probably do not. I would try adding a rotation (around the vertical axis) to satisfy this assumption.
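
For example, a minimal sketch (not code from this package) of rotating the cloud about the vertical axis before projecting it; the function name and the angle theta are just placeholders:

// Hedged sketch: yaw the point cloud so the LiDAR roughly faces the same way
// as the camera before the projection step.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/common/transforms.h>
#include <Eigen/Geometry>

pcl::PointCloud<pcl::PointXYZ>::Ptr
rotateAboutVertical(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr &cloud, double theta)
{
  Eigen::Affine3f tf = Eigen::Affine3f::Identity();
  tf.rotate(Eigen::AngleAxisf(static_cast<float>(theta), Eigen::Vector3f::UnitZ()));

  pcl::PointCloud<pcl::PointXYZ>::Ptr rotated(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::transformPointCloud(*cloud, *rotated, tf);
  return rotated;
}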


pirobot commented Jan 15, 2019

Thanks for your reply! The frames of our LiDAR and camera are shown below; the camera frame is the rgb_optical frame. For the LiDAR, the z-axis points up, the x-axis points forward, and the y-axis points to the left. For the camera optical frame, the z-axis points forward, the x-axis points to the right, and the y-axis points down. How are the axes set up on your robot?

[images: LiDAR frame and camera optical frame axes]
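
For reference, with the conventions described above (LiDAR: x forward, y left, z up; camera optical frame: x right, y down, z forward), the fixed rotation taking LiDAR coordinates into the optical frame can be written out explicitly. This is only an illustrative sketch, not code from the package, and it ignores the translation between the sensors:

// Hedged sketch: axis remapping from a LiDAR frame to a camera optical frame.
//   x_opt = -y_lidar, y_opt = -z_lidar, z_opt = x_lidar
#include <Eigen/Dense>

Eigen::Vector3d lidarToOptical(const Eigen::Vector3d &p_lidar)
{
  Eigen::Matrix3d R;
  R <<  0.0, -1.0,  0.0,   // x_opt = -y_lidar
        0.0,  0.0, -1.0,   // y_opt = -z_lidar
        1.0,  0.0,  0.0;   // z_opt =  x_lidar
  return R * p_lidar;
}

A point 2 m straight ahead of the LiDAR, (2, 0, 0), then becomes (0, 0, 2) in the optical frame, which is what the projection expects.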


pirobot commented Jan 16, 2019

OK, I think I figured out how to align the camera and the LiDAR, so now the pointcloud looks correct. However, when I view the input plane in the function detect4spheres() in Calibration3DMarker.cpp, it looks like this:

[image: input plane passed to detect4spheres()]

This looks like the LiDAR shadow cast by the marker on the wall behind it. Shouldn't the plane be the marker itself?

martin-velas (Contributor) commented

Yes, you are right - it should be the marker, not the "shadow". The thing is that the marker is assumed to be the best-fitting plane in front of the LiDAR. Since in your simulation there is a large, ideal wall behind the circles, the algorithm gets confused. I would add some clutter (boxes) or make the wall smaller or curved.
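
To illustrate the point (plain PCL, not this package's code): generic RANSAC plane segmentation simply returns the plane with the most inliers, and with a large flat wall in view that winner is the wall rather than the marker:

// Hedged illustration: the dominant plane, i.e. the one with the most inliers,
// wins. A big ideal wall easily beats the much smaller marker plane.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/ModelCoefficients.h>
#include <pcl/PointIndices.h>
#include <pcl/segmentation/sac_segmentation.h>

void segmentDominantPlane(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr &cloud,
                          pcl::PointIndices &inliers,
                          pcl::ModelCoefficients &coefficients)
{
  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setModelType(pcl::SACMODEL_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setDistanceThreshold(0.02);     // 2 cm tolerance, arbitrary for this sketch
  seg.setInputCloud(cloud);
  seg.segment(inliers, coefficients); // the largest consistent plane wins
}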


pirobot commented Jan 22, 2019

Thanks @martin-velas. I tried filtering the Velodyne pointcloud so that only the target is visible as shown in the image below:

[image: filtered Velodyne point cloud containing only the target]

Now I seem to get a step further, and the output from the coarse calibration looks like this:

Sphere Inliers: 13
1. circle: 0.313761
   0.0980946
   1.94556
   0.116335

Sphere Inliers: 9
2. circle: 0.749703
   -0.347437
   1.89251
   0.1113

Sphere Inliers: 19
3. circle: 0.728928
   -0.192599
   1.82317
   0.129315

Sphere Inliers: 19
4. circle: 0.00977266
   0.105911
   1.99999
   0.120074

ordered centers:
(0.728928,-0.192599,1.82317)
(0.749703,-0.347437,1.89251)
(0.00977266,0.105911,1.99999)
(0.313761,0.0980946,1.94556)

etc.
Bad PointCloud Detection!
[ WARN] [1548166629.226554706, 13910.150000000]: Calibration failed - trying again after 5s ...

So the calibration always fails with the "Bad PointCloud Detection" message as shown above. I've tried changing the sphere model tolerance (0.005 to 0.5) and the min/max radius (0.09/0.14) since my target circles have a radius of 0.12 m and are 0.3 meters apart.
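
For reference, a hedged sketch of sphere RANSAC with explicit radius limits like the 0.09/0.14 values above; this is plain PCL, not the package's actual detection code, and the thresholds are only the ones mentioned in this comment:

// Hedged sketch: fit a sphere while constraining the radius to 0.09-0.14 m.
// With only 9-19 inliers per circle the fit is not very well constrained, so a
// detection can still be rejected downstream.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/sample_consensus/ransac.h>
#include <pcl/sample_consensus/sac_model_sphere.h>
#include <Eigen/Dense>
#include <vector>

bool fitSphere(const pcl::PointCloud<pcl::PointXYZ>::Ptr &cloud,
               Eigen::VectorXf &coeffs,      // (cx, cy, cz, r)
               std::vector<int> &inliers)
{
  pcl::SampleConsensusModelSphere<pcl::PointXYZ>::Ptr
      model(new pcl::SampleConsensusModelSphere<pcl::PointXYZ>(cloud));
  model->setRadiusLimits(0.09, 0.14); // target circles have r = 0.12 m

  pcl::RandomSampleConsensus<pcl::PointXYZ> ransac(model);
  ransac.setDistanceThreshold(0.01);  // 1 cm tolerance, arbitrary here
  if (!ransac.computeModel())
    return false;

  ransac.getInliers(inliers);
  ransac.getModelCoefficients(coeffs);
  return true;
}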

SubMishMar commented

@pirobot how did you fix the camera and lidar orientation? And where in the code can I do it?


pirobot commented Apr 4, 2019

@SubMishMar Sorry, I honestly don't remember, and I had to move on to a different approach since we could not get this to work on the real robot.

pirobot closed this as completed Apr 4, 2019
SubMishMar commented

@pirobot thanks for responding. What different approach are you using (or did you use)?


pirobot commented Apr 9, 2019

We are trying the velo2cam_calibration package but so far we cannot get very good results on the real robot (the Gazebo demo works well).

junzhang2016 commented

> We are trying the velo2cam_calibration package but so far we cannot get very good results on the real robot (the Gazebo demo works well).

Yes, I got the same issue. velo2cam_calibration works well in Gazebo, but in real environments, it takes some effort.


AMINE-GAIDI commented Oct 11, 2020

Hi everyone, thank you for this discussion. @pirobot, can you please share your Gazebo calibration setup for your robot?
