
This repo contains the code of the paper "Continuous-Time vs. Discrete-Time Vision-based SLAM: A Comparative Study", RA-L 2022.


Continuous- and discrete-time vision-based SLAM

Why this fork exists and what was changed

The original repository uses some constructs that raise errors in current environments, so a few things were modified (mainly the yaml files and the Python code). Note that the original code does not use relative paths, and this fork does not add them either, so when using this code you still need to manually edit some of the paths in the yaml files; a hypothetical helper for this is sketched below.
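A minimal sketch of one way to automate those path edits, assuming the experiment yaml files store absolute paths as top-level string values (the old/new prefixes below are illustrative, not taken from this repo):

import yaml

def patch_paths(yaml_fn, replacements):
    # Replace absolute path prefixes in every top-level string value.
    with open(yaml_fn) as f:
        cfg = yaml.safe_load(f)
    for key, value in cfg.items():
        if isinstance(value, str):
            for old_prefix, new_prefix in replacements.items():
                if value.startswith(old_prefix):
                    cfg[key] = new_prefix + value[len(old_prefix):]
    with open(yaml_fn, 'w') as f:
        yaml.safe_dump(cfg, f)

# Hypothetical usage: adjust the prefixes to wherever you cloned the repo.
patch_paths('experiments/EuRoC/V2_01_easy/colmap_fitted_spline/v2_01_easy.yaml',
            {'/home/original-user/rpg_vision-based_slam': '/path/to/your/clone'})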

After these modifications the code runs correctly on the EuRoC dataset, but the UZH_FPV dataset cannot be run because the yaml file indoor_forward_3_snapdragon.yaml lacks a correct init_t_offset_cam_gp parameter [ISSUE].

Environment versions

cuda 11.4 (important: in my tests, version 12.3 prevents colmap from using the GPU correctly; it becomes far slower or fails to run at all)

python 3.8 (important: parts of the original code are written in Python 2 style and will not run correctly)

ros noetic

ubuntu 20.04

ceres-solver 2.1.0

colmap 3.10

Original README content

Continuous- and discrete-time vision-based SLAM

Publication

If you use this code in an academic context, please cite the following RA-L 2022 paper.

G. Cioffi, T. Cieslewski, and D. Scaramuzza, "Continuous-Time vs. Discrete-Time Vision-based SLAM: A Comparative Study," IEEE Robotics and Automation Letters (RA-L), 2022.

@InProceedings{CioffiRal2022,
  author = {Cioffi, Giovanni and Cieslewski, Titus and Scaramuzza, Davide},
  title = {Continuous-Time vs. Discrete-Time Vision-based SLAM: A Comparative Study},
  booktitle = {IEEE Robotics and Automation Letters (RA-L)},
  year = {2022}
}

Installation

These instructions have been tested on Ubuntu 18.04 and Ubuntu 20.04 with python 2.7 (python 3 support will come at a later point).

Prerequisites

Install Ceres Solver and COLMAP.

Build the repo

Git clone the repo:

git clone --recursive [email protected]:uzh-rpg/rpg_vision-based_slam.git

Build:

cd rpg_vision-based_slam

mkdir build

cd build

cmake .. -DCMAKE_BUILD_TYPE=Release

make -j4

Run

We provide here instructions on how to run our SLAM solution using visual, inertial, and global positional measurements. Check below for working examples on the UZH FPV dataset and EuRoC dataset.

COLMAP

As a first step, we run COLMAP to get an initial camera trajectory as well as a sparse 3D map.

We provide python scripts that can be used to generate config files for COLMAP (run COLMAP from command line). Check: scripts/python/create_colmap_project_$dataset-name$.py

Extract the camera trajectory from COLMAP output:

python scripts/python/extract_traj_estimate_from_colmap_$dataset-name$.py $FLAGS$
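For reference, the core of such an extraction can be sketched as follows, assuming a text-format COLMAP model (images.txt). COLMAP stores world-to-camera poses as a quaternion (qw qx qy qz) plus translation (tx ty tz), so the camera position in the world frame is -R^T t:

import numpy as np

def quat_to_rotmat(qw, qx, qy, qz):
    # Rotation matrix from a (w, x, y, z) unit quaternion.
    return np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qw*qz),     2*(qx*qz + qw*qy)],
        [2*(qx*qy + qw*qz),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qw*qx)],
        [2*(qx*qz - qw*qy),     2*(qy*qz + qw*qx),     1 - 2*(qx*qx + qy*qy)]])

def read_colmap_positions(images_txt):
    positions = []
    with open(images_txt) as f:
        lines = [l for l in f if not l.startswith('#')]
    # images.txt alternates: one pose line, one 2D-points line per image.
    for pose_line in lines[::2]:
        elems = pose_line.split()
        qw, qx, qy, qz = map(float, elems[1:5])
        t = np.array(list(map(float, elems[5:8])))
        R = quat_to_rotmat(qw, qx, qy, qz)
        positions.append(-R.T @ t)  # camera center in the world frame
    return np.array(positions)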

Continuous-time SLAM

Fit the B-spline to the camera trajectory:

(from the build folder)

./fit_spline_to_colmap $CONFIG_FILE$
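The repo implements this step in C++. Purely as a conceptual illustration of fitting a cubic B-spline to the camera positions (the actual code also handles orientations), a scipy sketch, assuming timestamps t of shape (N,) and positions pos of shape (N, 3):

import numpy as np
from scipy.interpolate import splprep, splev

def fit_position_spline(t, pos, smoothing=0.0):
    # splprep fits one cubic spline per coordinate, parameterized by time.
    tck, _ = splprep([pos[:, 0], pos[:, 1], pos[:, 2]], u=t, k=3, s=smoothing)
    return tck

# Query the fitted spline at arbitrary times, e.g. IMU timestamps:
# x, y, z = splev(t_query, tck)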

Initial spatial alignment (scale and pose) of the B-spline to the global frame:

python scripts/python/initialize_spline_to_global_frame_spatial_alignment.py $FLAGS$
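One standard closed-form way to compute such a scale-and-pose (sim(3)) alignment between two time-associated sets of positions is the Umeyama method; a minimal sketch of that idea (an assumption about what this step amounts to, not a copy of the script):

import numpy as np

def umeyama_alignment(src, dst):
    # src: (N, 3) spline/COLMAP positions; dst: (N, 3) global positions.
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0  # keep a proper rotation (det = +1)
    R = U @ S @ Vt
    scale = np.trace(np.diag(D) @ S) / (src_c ** 2).sum() * len(src)
    t = mu_d - scale * R @ mu_s
    return scale, R, t  # dst ≈ scale * R @ src + t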

Align spline to the global frame:

(from the build folder)

./align_spline_to_global_frame $CONFIG_FILE$

Run full-batch optimization:

(from the build folder)

./optimize_continuous_time $CONFIG_FILE$

Discrete-time SLAM

Spatial alignment (scale and pose) of the camera trajectory estimated by COLMAP to the global frame:

python scripts/python/transform_colmap_to_global_frame.py $FLAGS$

Run full-batch optimization:

(from the build folder)

./optimize_discrete_time $CONFIG_FILE$

Example: UZH-FPV dataset (cannot run due to the [ISSUE] above)

We give here an example on how to run the continuous-time SLAM formulation on the sequence indoor forward facing 3 snapdragon of the UZH FPV dataset.

Data preparation

Create the folder rpg_vision-based_slam/datasets/UZH-FPV/indoor_forward_3_snapdragon.

Extract the contents of the raw-data and Leica-measurements .zip files into this folder.

The file datasets/UZH-FPV/calib/indoor_forward_calib_snapdragon/camchain-imucam-..indoor_forward_calib_snapdragon_imu_simple.yaml contains a simplified version (for yaml parsing) of the calibration files.

Run COLMAP

Create the COLMAP project

python scripts/python/create_colmap_project_uzhfpv_dataset.py --env=i --cam=fw --nr=3 --sens=snap --cam_i=left

This script creates config files to use in COLMAP. It will also print in the terminal the commands to execute in order to run COLMAP:

colmap database_creator --database_path $path-to-root-folder$/datasets/UZH-FPV/colmap/indoor_forward_3_snapdragon/database.db

colmap feature_extractor --project_path $path-to-root-folder$/datasets/UZH-FPV/colmap/indoor_forward_3_snapdragon/feature_extractor_config.ini

colmap sequential_matcher --project_path $path-to-root-folder$/datasets/UZH-FPV/colmap/indoor_forward_3_snapdragon/sequential_matcher_config.ini

colmap mapper --project_path $path-to-root-folder$/datasets/UZH-FPV/colmap/indoor_forward_3_snapdragon/mapper_config.ini

Visualize results using the COLMAP gui:

colmap gui

File -> Import model -> Select folder containing the model, e.g. folder 0

colmap gui --database_path $path-to-/database.db$ --image_path $path-to-img-folder$

Extract COLMAP estimated trajectory:

python scripts/python/extract_traj_estimate_from_colmap_uzhfpv.py --env=i --cam=fw --nr=3 --sens=snap --cam_i=left

Prepare Leica measurements:

python scripts/python/make_leica_minimal.py --env=i --cam=fw --nr=3 --sens=snap

Run Continuous-time SLAM

mkdir -p results/UZH_FPV

cd build

./fit_spline_to_colmap ../experiments/UZH_FPV/indoor_forward_3_snapdragon/colmap_fitted_spline/indoor_forward_3_snapdragon.yaml

[Note] This is the step where the UZH_FPV pipeline fails; an issue has been opened but it has not been fixed.

python ../scripts/python/initialize_spline_to_global_frame_spatial_alignment_uzhfpv.py --config ../experiments/UZH_FPV/indoor_forward_3_snapdragon/spline_global_alignment/indoor_forward_3_snapdragon.yaml --env=i --cam=fw --nr=3 --sens=snap --gui

./align_spline_to_global_frame ../experiments/UZH_FPV/indoor_forward_3_snapdragon/spline_global_alignment/indoor_forward_3_snapdragon.yaml

./optimize_continuous_time ../experiments/UZH_FPV/indoor_forward_3_snapdragon/full_batch_optimization/continuous_time/indoor_forward_3_snapdragon.yaml

Run Discrete-time SLAM

cd build

python ../scripts/python/transform_colmap_to_global_frame.py --config ~/rpg_vision-based_slam/experiments/UZH_FPV/indoor_forward_3_snapdragon/colmap_global_alignment/indoor_forward_3_snapdragon.yaml --gui

./optimize_discrete_time ../experiments/UZH_FPV/indoor_forward_3_snapdragon/full_batch_optimization/discrete_time/indoor_forward_3_snapdragon.yaml

Plot results

Plot results of spline fitting:

python scripts/python/plot_results_spline_fitting_to_colmap_traj.py --config experiments/UZH_FPV/indoor_forward_3_snapdragon/colmap_fitted_spline/indoor_forward_3_snapdragon.yaml

Plot results of spline alignment:

python scripts/python/plot_results_spline_global_frame_alignment.py --config experiments/UZH_FPV/indoor_forward_3_snapdragon/spline_global_alignment/indoor_forward_3_snapdragon.yaml

Plot final results:

python scripts/python/plot_results_continuous_time.py --config experiments/UZH_FPV/indoor_forward_3_snapdragon/full_batch_optimization/continuous_time/indoor_forward_3_snapdragon.yaml

python scripts/python/plot_results_discrete_time.py --config experiments/UZH_FPV/indoor_forward_3_snapdragon/full_batch_optimization/discrete_time/indoor_forward_3_snapdragon.yaml

Example: EuRoC dataset

We give here an example on how to run the continuous-time and the discrete-time SLAM formulations on the sequence V2 01 easy of the EuRoC dataset.

Data preparation

Create the folder rpg_vision-based_slam/datasets/EuRoC/V2_01_easy.

Download the rosbag into this folder. Use the following script to extract the data:

[Note] In addition to the rosbag mentioned above, you also need to download the ASL-format dataset archive; its ground-truth csv file is used. Place it at rpg_vision-based_slam/datasets/EuRoC/V2_01_easy/state_groundtruth_estimate0/data.csv.

python scripts/python/extract_from_euroc_rosbag.py --room=V2 --nr=1 --cam=right
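The core of such an extraction looks roughly like the sketch below, assuming the standard EuRoC topic names (/cam0/image_raw for the left camera, /cam1/image_raw for the right) and a ROS noetic environment with rosbag and cv_bridge on the Python path; the output folder is illustrative and must already exist:

import cv2
import rosbag
from cv_bridge import CvBridge

bridge = CvBridge()
with rosbag.Bag('datasets/EuRoC/V2_01_easy/V2_01_easy.bag') as bag:
    for topic, msg, t in bag.read_messages(topics=['/cam1/image_raw']):
        # Convert the ROS image message to an OpenCV grayscale image.
        img = bridge.imgmsg_to_cv2(msg, desired_encoding='mono8')
        stamp_ns = msg.header.stamp.to_nsec()
        cv2.imwrite('img/%d.png' % stamp_ns, img)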

The file datasets/EuRoC/calib/Vicon_room/calib.yaml contains the calibration file for this sequence.

Run COLMAP

Create the COLMAP project

python scripts/python/create_colmap_project_euroc_dataset.py --room=V2 --nr=1 --cam=right

Follow the output of the previous script to run COLMAP.

Extract COLMAP estimated trajectory:

python scripts/python/extract_traj_estimate_from_colmap_euroc.py --room=V2 --nr=1 --cam=right --colmap_model_id=0

Create global positional measurements from the ground truth:

[Note] This is exactly the step that fails if you do not download the ASL-format archive: the ground-truth file will be missing.

python scripts/python/extract_euroc_groundtruth.py --room=V2 --nr=1

python scripts/python/make_global_measurements_euroc.py --room=V2 --nr=1 --freq=10.0 --noise=0.10
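The --freq and --noise flags indicate that the ground truth is downsampled (here to 10 Hz) and corrupted with zero-mean Gaussian noise (here sigma = 0.10 m) to simulate noisy global positional measurements; a minimal sketch of that idea (an assumption, not the script itself):

import numpy as np

def make_global_measurements(t, pos, freq=10.0, noise=0.10, seed=0):
    # t: (N,) timestamps in seconds; pos: (N, 3) ground-truth positions.
    rng = np.random.default_rng(seed)
    period = 1.0 / freq
    keep, t_last = [], -np.inf
    for i, ti in enumerate(t):
        if ti - t_last >= period:  # downsample to the target frequency
            keep.append(i)
            t_last = ti
    keep = np.array(keep)
    noisy = pos[keep] + rng.normal(0.0, noise, size=(len(keep), 3))
    return t[keep], noisy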

Run Continuous-time SLAM

mkdir -p results/EuRoC

cd build

./fit_spline_to_colmap ../experiments/EuRoC/V2_01_easy/colmap_fitted_spline/v2_01_easy.yaml

python ../scripts/python/initialize_spline_to_global_frame_spatial_alignment.py --config ~/rpg_vision-based_slam/experiments/EuRoC/V2_01_easy/spline_global_alignment/v2_01_easy.yaml --gui

./align_spline_to_global_frame ../experiments/EuRoC/V2_01_easy/spline_global_alignment/v2_01_easy.yaml

./optimize_continuous_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization/continuous_time/v2_01_easy.yaml

Run Discrete-time SLAM

python scripts/python/transform_colmap_to_global_frame.py --config ~/rpg_vision-based_slam/experiments/EuRoC/V2_01_easy/colmap_global_alignment/v2_01_easy.yaml --gui

./optimize_discrete_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization/discrete_time/v2_01_easy.yaml

Run with a subset of sensor modalities

We give here examples of how to run our SLAM algorithm with a subset of the sensor modalities.

We use the sequence V2 01 easy of the EuRoC dataset.

Global-Visual SLAM

cd build

For continuous time:

./optimize_gv_continuous_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization_gv/continuous_time/v2_01_easy.yaml

For discrete time:

./optimize_gv_discrete_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization_gv/discrete_time/v2_01_easy.yaml

Global-Inertial SLAM

For continuous time:

cd build

./fit_spline_to_gp_measurements ../experiments/EuRoC/V2_01_easy/fit_spline_on_gp_meas/v2_01_easy.yaml

./optimize_gi_continuous_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization_gi/continuous_time/v2_01_easy.yaml

For discrete time:

./optimize_gi_discrete_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization_gi/discrete_time/v2_01_easy.yaml

Visual-Inertial SLAM

The trajectory estimated by COLMAP needs to be aligned to a gravity-aligned frame. This script is a good starting point for estimating the gravity direction from accelerometer measurements.
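One simple approach, sketched below under the assumption that the platform is stationary at the start of the sequence: average the accelerometer samples over an initial window (the mean then points along "up", opposite to gravity) and build the rotation that maps this direction onto the world z-axis via Rodrigues' formula.

import numpy as np

def gravity_aligning_rotation(acc):
    # acc: (N, 3) body-frame accelerometer samples from a stationary window.
    g_body = acc.mean(0)
    g_body /= np.linalg.norm(g_body)   # unit "up" direction in the body frame
    z = np.array([0.0, 0.0, 1.0])      # world "up"
    v = np.cross(g_body, z)
    c = g_body @ z
    if np.linalg.norm(v) < 1e-8:       # already aligned (or exactly opposite)
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    # Rodrigues' formula for the rotation taking g_body onto z.
    return np.eye(3) + vx + vx @ vx * (1.0 / (1.0 + c))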

For continuous time:

./optimize_vi_continuous_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization_vi/continuous_time/v2_01_easy.yaml

For discrete time

./optimize_vi_discrete_time ../experiments/EuRoC/V2_01_easy/full_batch_optimization_vi/discrete_time/v2_01_easy.yaml

Others

Trajectory evaluation

Install the trajectory evaluation toolbox.

Single trajectory:

rosrun rpg_trajectory_evaluation analyze_trajectory_single.py $path-to-folder$

Multiple trajectories:

rosrun rpg_trajectory_evaluation analyze_trajectories.py $path-to-config$ --output_dir=$path$ --results_dir=$path$ --platform $value$ --odometry_error_per_dataset --plot_trajectories --rmse_table --rmse_boxplot

Credits

This repo uses some external open-source code; refer to each project for the corresponding license.

If you note that we missed the information about the use of any other open-source code, please open an issue.
