From 851db08347849a94c51d3ebc3a36df8a114b800f Mon Sep 17 00:00:00 2001
From: "Juan D. Tardos"
Date: Wed, 22 Dec 2021 13:41:59 +0100
Subject: [PATCH] Update README.md

---
 README.md | 28 ++++++++++++++++++++++------
 1 file changed, 22 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 4c34722f57..0ca3a9f2d3 100644
--- a/README.md
+++ b/README.md
@@ -38,7 +38,7 @@ For a closed-source version of ORB-SLAM3 for commercial purposes, please contact
 
 If you use ORB-SLAM3 in an academic work, please cite:
 
-    @article{ORBSLAM3_2020,
+    @article{ORBSLAM3_TRO,
       title={{ORB-SLAM3}: An Accurate Open-Source Library for Visual, Visual-Inertial
            and Multi-Map {SLAM}},
       author={Campos, Carlos AND Elvira, Richard AND G\'omez, Juan J. AND Montiel,
@@ -95,7 +95,23 @@ chmod +x build.sh
 
 This will create **libORB_SLAM3.so** in the *lib* folder and the executables in the *Examples* folder.
 
-# 4. EuRoC Examples
+# 4. Running ORB-SLAM3 with your camera
+
+The `Examples` directory contains several demo programs and calibration files to run ORB-SLAM3 in all sensor configurations with the Intel RealSense T265 and D435i cameras. The steps needed to use your own camera are:
+
+1. Calibrate your camera following `Calibration_Tutorial.pdf` and write your calibration file `your_camera.yaml`.
+
+2. Modify one of the provided demos to suit your specific camera model, and build it.
+
+3. Connect the camera to your computer using USB3 or the appropriate interface.
+
+4. Run ORB-SLAM3. For example, for our D435i camera, we would execute:
+
+```
+./Examples/Stereo-Inertial/stereo_inertial_realsense_D435i Vocabulary/ORBvoc.txt ./Examples/Stereo-Inertial/RealSense_D435i.yaml
+```
+
+# 5. EuRoC Examples
 [EuRoC dataset](http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets) was recorded with two pinhole cameras and an inertial sensor. We provide an example script to launch EuRoC sequences in all the sensor configurations.
 
 1. Download a sequence (ASL format) from http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets
@@ -115,7 +131,7 @@ Execute the following script to process sequences and compute the RMS ATE:
 ```
 ./euroc_eval_examples
 ```
 
-# 5. TUM-VI Examples
+# 6. TUM-VI Examples
 [TUM-VI dataset](https://vision.in.tum.de/data/datasets/visual-inertial-dataset) was recorded with two fisheye cameras and an inertial sensor.
 
 1. Download a sequence from https://vision.in.tum.de/data/datasets/visual-inertial-dataset and uncompress it.
@@ -135,7 +151,7 @@ Execute the following script to process sequences and compute the RMS ATE:
 ```
 ./tum_vi_eval_examples
 ```
 
-# 6. ROS Examples
+# 7. ROS Examples
 
 ### Building the nodes for mono, mono-inertial, stereo, stereo-inertial and RGB-D
 Tested with ROS Melodic and Ubuntu 18.04.
@@ -212,8 +228,8 @@ Once ORB-SLAM3 has loaded the vocabulary, press space in the rosbag tab.
 ```
 rosrun rosbag fastrebag.py dataset-room1_512_16.bag dataset-room1_512_16_small_chunks.bag
 ```
 
-# 7. Running time analysis
+# 8. Running time analysis
 A flag in `include/Config.h` activates time measurements. Uncomment the line `#define REGISTER_TIMES` to obtain the time statistics of one execution, which are shown in the terminal and stored in a text file (`ExecTimeMean.txt`).
 
-# 8. Calibration
+# 9. Calibration
 You can find a tutorial for visual-inertial calibration and a detailed description of the contents of valid configuration files at `Calibration_Tutorial.pdf`.
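The patch's new Section 4 asks the user to adapt one of the provided demos to their own camera (step 2) before running it. Purely as an illustration of what such an adaptation could look like, and not as code from the patch or the repository, the C++ sketch below feeds a plain UVC webcam to the monocular configuration through OpenCV; the file name, camera index and timestamp source are assumptions, while the `ORB_SLAM3::System` calls mirror those used by the shipped examples.

```
// mono_webcam.cc (hypothetical name) -- minimal monocular demo for a generic
// OpenCV-readable camera; adapt paths and camera index to your setup.
#include <chrono>
#include <iostream>
#include <opencv2/opencv.hpp>

#include <System.h>   // ORB-SLAM3 entry point

int main(int argc, char **argv)
{
    if (argc != 3)
    {
        std::cerr << "Usage: ./mono_webcam path_to_vocabulary path_to_settings" << std::endl;
        return 1;
    }

    // Vocabulary, settings (your_camera.yaml from step 1), sensor type, viewer on/off.
    ORB_SLAM3::System SLAM(argv[1], argv[2], ORB_SLAM3::System::MONOCULAR, true);

    cv::VideoCapture cap(0);  // assumed: first camera on the system
    if (!cap.isOpened())
    {
        std::cerr << "Could not open the camera" << std::endl;
        return 1;
    }

    cv::Mat im;
    while (cap.read(im))
    {
        // Frame timestamp in seconds.
        double t = std::chrono::duration<double>(
                       std::chrono::steady_clock::now().time_since_epoch()).count();
        SLAM.TrackMonocular(im, t);
    }

    SLAM.Shutdown();
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");
    return 0;
}
```

Such a file would be built like the existing demos, typically by adding a matching `add_executable` target to the project's `CMakeLists.txt` that points at the new source file.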
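The D435i command shown in the new Section 4 runs the stereo-inertial configuration, where every tracked image pair must be accompanied by the IMU samples gathered since the previous pair. The fragment below is a sketch of how those samples reach the tracker, not code taken from the shipped `stereo_inertial_realsense_D435i` demo; the helper and variable names are assumptions, while `TrackStereo` and `ORB_SLAM3::IMU::Point` are the interfaces the existing inertial examples use.

```
// Illustrative fragment: handing buffered IMU samples to the stereo-inertial tracker.
#include <vector>
#include <opencv2/core/core.hpp>

#include <System.h>
#include <ImuTypes.h>

// vImuMeas holds every IMU sample measured since the previous image pair, oldest
// first; each ORB_SLAM3::IMU::Point packs accelerometer (x, y, z), gyroscope
// (x, y, z) and the sample timestamp.
void TrackStereoInertialFrame(ORB_SLAM3::System &SLAM,
                              const cv::Mat &imLeft, const cv::Mat &imRight,
                              double tImage,
                              const std::vector<ORB_SLAM3::IMU::Point> &vImuMeas)
{
    SLAM.TrackStereo(imLeft, imRight, tImage, vImuMeas);
}
```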