IMU problem with Mono-Inertial mode using ORB SLAM3 with Gazebo and RViz2 #910

Gwenunu opened this issue Jun 25, 2024 · 10 comments


Gwenunu commented Jun 25, 2024

Hello everyone and thank you for your work.

I would like to map a simulated environment in Gazebo, using a simulated robot that matches the real one as closely as possible.

To do this, I'm using ORB SLAM3 in mono-inertial mode, but I'm running into a few difficulties.

In particular, the IMU is very hard to initialize, and I get this kind of message in a loop:

(screenshot: 2024-06-25 17-48-52)

And when it does initialize, the results are correct at first, then become inaccurate:

(screenshots: 2024-06-25 17-54-27, 2024-06-24 16-40-12)

My settings look like this:

#--------------------------------------------------------------------------------------------
# Camera Parameters.
#--------------------------------------------------------------------------------------------
Camera.type: "PinHole"

# Camera calibration and distortion parameters (OpenCV) 
Camera.fx: 312.8884288345382
Camera.fy: 312.8884288345382
Camera.cx: 312.28302001953125
Camera.cy: 135.72957611083984

Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0

# Camera resolution
Camera.width: 500
Camera.height: 300

# Camera frames per second 
Camera.fps: 20.0

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 0

# Image scale, it changes the image size to be processed (<1.0: reduce, >1.0: increase)
Camera.imageScale: 1.0

#--------------------------------------------------------------------------------------------
# IMU Parameters. 
#--------------------------------------------------------------------------------------------

# Acceleration noise density (continuous-time)
IMU.NoiseAcc: 0.016478969528        # [m/s^2/sqrt(Hz)]

# Gyroscope noise density (continuous-time)
IMU.NoiseGyro: 0.003607190298     # [rad/s/sqrt(Hz)]

# Acceleration random walk (continuous-time)
IMU.AccWalk: 0.00464687056      # [m/s^3/sqrt(Hz)]

# Gyroscope random walk (continuous-time)
IMU.GyroWalk: 0.0003286612     # [rad/s^2/sqrt(Hz)]

# IMU Frequency
IMU.Frequency: 600

# Transformation from camera to body-frame (imu)
Tbc: !!opencv-matrix
   rows: 4
   cols: 4
   dt: f
   data: [  1.0, 1.55768e-06, -3.32658e-06, -0.0414102,
            3.67321e-06, -0.42406499, 0.905632, 0.058199801 ,
            2.41798e-17, -0.90563202, -0.42406499, 0.041999999,
            0.0, 0.0, 0.0, 1.0      ]

   # data: [  1.0, 0.0, 0.0, 0.04141,
   #          0.0, 1.0, 0.0, -0.0582,
   #          0.0, 0.0, 1.0, 0.042,
   #          0.0, 0.0, 0.0, 1.0      ]


#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 1000 # 1000

# ORB Extractor: Scale factor between levels in the scale pyramid 	
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid	
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold
# The image is divided into a grid. In each cell, FAST corners are extracted, imposing a minimum response.
# First we impose iniThFAST. If no corners are detected, we impose a lower value, minThFAST.
# You can lower these values if your images have low contrast.
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7

#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -3.5 # -1.8
Viewer.ViewpointF: 500
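
A quick way to sanity-check the Tbc block in settings like the ones above is to verify that its rotation part is actually a valid rotation, which is a common failure mode when hand-editing extrinsics. A minimal sketch, assuming NumPy:

```python
import numpy as np

# Tbc from the settings above (transformation from camera to body/IMU frame)
Tbc = np.array([
    [1.0,         1.55768e-06, -3.32658e-06, -0.0414102],
    [3.67321e-06, -0.42406499,  0.905632,     0.058199801],
    [2.41798e-17, -0.90563202, -0.42406499,   0.041999999],
    [0.0,          0.0,          0.0,          1.0],
])

R = Tbc[:3, :3]
# A valid rotation matrix satisfies R @ R.T = I and det(R) = +1
assert np.allclose(R @ R.T, np.eye(3), atol=1e-5), "rotation block is not orthonormal"
assert np.isclose(np.linalg.det(R), 1.0, atol=1e-5), "rotation block is not right-handed"
print("Tbc rotation block looks valid")
```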

I use:
ROS 2 Humble
Gazebo Classic 11
Ubuntu 22.04

Furthermore, I would like to display these point clouds (correct or not) in RViz, but I don't know how to do it. I'm using the OrbSlam3 Ros2 Wrapper. If anyone has an idea, that would be very kind.

Don't hesitate to ask if you're missing any information needed to solve this problem.

Thank you for your time.

@Gwenunu Gwenunu changed the title Utiliser ORB SLAM3 avec Gazebo et RViz2 Using ORB SLAM3 with Gazebo and RViz2 Jun 25, 2024
@Gwenunu Gwenunu changed the title Using ORB SLAM3 with Gazebo and RViz2 IMU problem with Mono-Inertial mode using ORB SLAM3 with Gazebo and RViz2 Jun 25, 2024

WFram commented Jun 25, 2024

There are at least two likely concerns here:

  1. As you can see from the log, ORB-SLAM's visual-inertial mode struggles to initialize from your IMU readings. I'm not surprised: you're using a planar robot that you drive with mostly constant acceleration. In addition, since the motion is planar, there is no excitation along the Z axis, and pitch and roll are always zero. With a monocular camera, it's really difficult to do SLAM from such IMU data. Have you considered using a stereo camera? Or dropping the IMU and using wheel odometry instead? The latter is not supported in ORB-SLAM, but there are modifications that add it.

  2. Your scene is simulated and contains very repetitive visual structure. Using loop closing here is a bad idea: there are lots of loop-closure candidates, and once one gets accepted and a loop closure occurs, your map breaks, as in the screenshot you sent. Don't use loop closing in such scenes; you can turn it off in the file you sent.


Gwenunu commented Jun 26, 2024

Hello, thank you for your reply.

I was using a mono camera for "simplicity", as I'm just starting out and have never done vision before. But the real robot has two cameras, so I'll try it that way.

When you say to disable loop closing, are you referring to this kind of thing: #109?

Thanks for your time, have a nice day


WFram commented Jun 26, 2024

About loop closing, yes; I forgot that there is no way to switch it off in the YAML.
By the way, in terms of simplicity, a stereo camera is a much better starting point for SLAM or visual odometry, since with stereo you can initialize the scale of the scene immediately and avoid scale drift.


Gwenunu commented Jun 27, 2024

Good evening, thank you for your reply.

I'm going to work on stereo-inertial SLAM. I've started, but for the moment I have a problem with the camera calibration: I don't know whether it should go in the simulated robot's *.sdf model, in the *.yaml, or both?

I'll repost if I get any results, or even more questions.

Thanks again, have a nice day/evening


WFram commented Jun 27, 2024

Hello; well, it depends on the simulator you're using. I think the calibration can be obtained from the URDF model of your robot; it depends on your robot description.
Just find the camera model (yours is pinhole, I guess), the intrinsic parameters (fx, fy, cx, cy), the distortion parameters (k1, k2, k3, k4), and the image size, and use them in the .yaml file with the ORB-SLAM settings.
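
For Gazebo Classic specifically, the pinhole intrinsics can be derived from the camera's `<horizontal_fov>` and image size in the SDF/URDF. A sketch of the standard formula (the FOV value below is a made-up example, not taken from this thread):

```python
import math

def pinhole_from_gazebo(width, height, horizontal_fov):
    """Derive pinhole intrinsics from a Gazebo camera's image size
    and <horizontal_fov> (in radians)."""
    fx = width / (2.0 * math.tan(horizontal_fov / 2.0))
    fy = fx                # Gazebo simulates square pixels
    cx = width / 2.0       # principal point at the image center
    cy = height / 2.0
    return fx, fy, cx, cy

# Hypothetical 500x300 camera with a 1.35 rad horizontal FOV
fx, fy, cx, cy = pinhole_from_gazebo(500, 300, 1.35)
```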


Gwenunu commented Jul 1, 2024

Hello, your help has been invaluable, and I need your knowledge again.

The viewers work. I did a test in plain stereo, to exercise the cameras only, and I get a really decent map.

Things get complicated when I switch back to stereo-inertial, because I can't "calibrate" the IMU: even though I'm using a simulation and am in control of the precision, I get the same problems even with low noise values.

The camera in the viewer moves at high speed even when stationary, which gives me a completely random map.

(screenshot: 2024-07-01 12-24-21)

Thank you.


WFram commented Jul 1, 2024

Well, with an IMU, such "performance" can come from the following:

  1. IMU extrinsics. Make sure you set Tbc in the expected convention. If the camera -> IMU format is required, make sure the matrix really is the transformation from camera to IMU.
  2. IMU noise. Setting low values can actually make performance worse if the IMU is erroneous. Ideally, calibrate them. But also try inflating them by multiplying by 10, 100, etc. (not all at once, but separately).
  3. Time synchronization. Maybe your images and IMU have a large time offset. You can calibrate it, along with all the IMU parameters you need, using Kalibr.
  4. Gravity units. I think ORB-SLAM expects accelerations in m/s^2, but you'll have to check this. Sometimes the units of measurement are g instead of m/s^2; just multiply by the gravity magnitude if that's your case.
  5. Empty IMU measurements. Sometimes there is no IMU measurement between two camera frames. Check that the IMU frequency is high enough (at least 200 Hz); you might also want to visualize your IMU data.

I would check (1), (4) and (5) first, then (2). If that doesn't help, check (3).
By the way, is it still simulated data?
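
Points (4) and (5) above can be checked offline. A rough sketch, assuming NumPy and that you have dumped stationary accelerometer samples plus per-message timestamps (all names here are illustrative, not part of any ORB-SLAM3 API):

```python
import numpy as np

def guess_accel_units(accel_samples):
    """On a stationary IMU, the mean specific-force magnitude should be
    ~9.81 if the data is in m/s^2, or ~1.0 if it is in g."""
    mag = np.linalg.norm(accel_samples.mean(axis=0))
    if abs(mag - 9.81) < 0.5:
        return "m/s^2"
    if abs(mag - 1.0) < 0.1:
        return "g: multiply by 9.81 before feeding ORB-SLAM3"
    return "unclear: check bias or motion"

def max_imu_gap(imu_stamps):
    """Largest gap between consecutive IMU timestamps; if this exceeds the
    camera frame period, some frame pairs will have no IMU data between them."""
    return np.max(np.diff(np.sort(imu_stamps)))

# Illustrative stationary data: gravity on Z, 200 Hz timestamps
samples = np.tile([0.0, 0.0, 9.8], (100, 1))
print(guess_accel_units(samples))                      # m/s^2
print(max_imu_gap(np.arange(0.0, 0.5, 1.0 / 200.0)))   # ~0.005 s
```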


Gwenunu commented Jul 3, 2024

Hello,
I have followed your recommendations. My Tbc matrix comes from my URDF, obtained with ros2 run tf2_ros tf2_echo [source_frame] [target_frame]. I get:

At time 0.0
- Translation: [0.079, 0.042, -0.058]
- Rotation: in Quaternion [0.000, -0.707, 0.707, 0.000]
- Rotation: in RPY (radian) [-1.571, -0.000, 3.142]
- Rotation: in RPY (degree) [-90.000, -0.000, 180.000]
- Matrix:
 -1.000 -0.000  0.000  0.079
  0.000 -0.000 -1.000  0.042
  0.000 -1.000  0.000 -0.058
  0.000  0.000  0.000  1.000
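
This tf2_echo output can be cross-checked by building the homogeneous matrix from the quaternion yourself. A sketch, assuming NumPy and the (x, y, z, w) quaternion order that tf2_echo prints:

```python
import numpy as np

def quat_to_matrix(x, y, z, w, translation):
    """4x4 homogeneous transform from a quaternion (x, y, z, w) and a
    translation, matching the Matrix block that tf2_echo prints."""
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = translation
    return T

# The output above: quaternion [0, -0.707, 0.707, 0], translation [0.079, 0.042, -0.058]
s = np.sqrt(0.5)
T = quat_to_matrix(0.0, -s, s, 0.0, [0.079, 0.042, -0.058])
# The rotation block reproduces the printed matrix:
# [[-1, 0, 0], [0, 0, -1], [0, -1, 0]]
```

Note that tf2_echo gives the transform between whichever two TF frames you name on the command line; double-check its direction against the camera-to-body convention ORB-SLAM3 expects for Tbc before pasting it into the YAML.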

My measurements are indeed in m/s².

I also calibrated the real IMU with a calibration node, then took that data and inserted it into the simulation's *.sdf model, so as to start from "reasonable" values.

But I have a "problem" with the IMU frequency: the higher the frequency, the more noise I get, especially visible in the linear acceleration along the Z axis. So I had to go down to 100 Hz to get low noise. Is this the right approach?
100 Hz: (screenshot: 2024-07-03 16-17-32)
200 Hz: (screenshot)
400 Hz: (screenshot)
1000 Hz: (screenshot)

I read in this issue that Gazebo's physics engines generate noise, so I wanted to switch engines, but that led to complications with the differential-drive control plugins, so I stayed with ODE.

I'm getting good results at the moment, and I'd like to come back to this "loop closing" thing, because I'm not sure I've understood how to disable it other than through the *.yaml parameters. At the moment I'm having problems with it: once I've made a turn with a correct map, I get:

(screenshot: 2024-07-03 16-16-39)

And everything goes wrong.


Here is my simulated IMU sensor:

          <sensor name="imu_jetson" type="imu">
            <!-- <pose>0 0 0 0 0 0</pose> -->
            <always_on>true</always_on>
            <update_rate>100</update_rate>
            <visualize>true</visualize>
            <enable_metrics>true</enable_metrics>

                <imu>
                  <angular_velocity>
                    <x>
                      <noise type="gaussian">
                        <mean>-0.002660</mean>
                        <stddev>0.003562</stddev>
                      </noise>
                    </x>
                    <y>
                      <noise type="gaussian">
                        <mean>0.007384</mean>
                        <stddev>0.000654</stddev>
                      </noise>
                    </y>
                    <z>
                      <noise type="gaussian">
                        <mean>-0.006493</mean>
                        <stddev>0.000485</stddev>
                      </noise>
                    </z>
                  </angular_velocity>

                  <linear_acceleration>
                    <x>
                      <noise type="gaussian">
                        <mean>-0.913523</mean>
                        <stddev>0.024878</stddev>
                      </noise>
                    </x>
                    <y>
                      <noise type="gaussian">
                        <mean>0.515525</mean>
                        <stddev>0.013044</stddev>
                      </noise>
                    </y>
                    <z>
                      <noise type="gaussian">
                        <mean>0.0</mean>
                        <stddev>0.024728</stddev>
                      </noise>
                    </z>
                  </linear_acceleration>

                </imu>
                <plugin name="imu_jetson" filename="libgazebo_ros_imu_sensor.so">
                  <initial_orientation_as_reference>false</initial_orientation_as_reference>
                </plugin>
              </sensor>
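
One detail worth noting about the `<stddev>` values above: Gazebo's Gaussian sensor noise is applied per sample, while the ORB-SLAM3 YAML stores continuous-time noise densities (per √Hz). If that interpretation holds for your plugin, the two are related through the update rate, which is one reason the apparent noise level changes when you change the frequency. A sketch of the conversion:

```python
import math

def per_sample_stddev(noise_density, rate_hz):
    """Discrete per-sample stddev (what Gazebo's <stddev> takes) from a
    continuous-time noise density (what the ORB-SLAM3 YAML stores)."""
    return noise_density * math.sqrt(rate_hz)

def continuous_density(stddev, rate_hz):
    """Inverse: continuous-time density from a per-sample stddev."""
    return stddev / math.sqrt(rate_hz)

# With the accelerometer density from the settings earlier in this thread, at 200 Hz:
sigma = per_sample_stddev(0.016478969528, 200.0)  # ~0.233 m/s^2 per sample
```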

And yes, it's still a simulated project, with "real" values to get closer to reality.

Thank you so much for your time and help !


sirackerman commented Oct 8, 2024

Hi @Gwenunu, I'm working on a similar project, and as I'm also a beginner in this, I'm having some difficulty making it work. If possible, I'd like your guidance on how to make the robot in the simulated Gazebo environment map its surroundings and generate the map in either the ORB-SLAM3 viewer or in RViz. I can already run ORB-SLAM3 as a standalone library and in ROS as well. I'm also looking for a way to simulate my own robot and then try the system on a real robot. I'm using an Intel D435i and want to work in stereo-inertial mode. Any help or link to documentation will be greatly appreciated. Thank you in advance!


EmmanuelMess commented Oct 23, 2024

@Gwenunu I had problems similar to the spikes in the IMU data you show; adding noise with 0.5 stddev helped.
