diff --git a/README.md b/README.md
index 2c45cb1..c1d2ede 100644
--- a/README.md
+++ b/README.md
@@ -1,78 +1,113 @@
# Tsukuba Challenge Datasets
-**Real World Datasets for Autonomous Navigation**
+## Real World Datasets for Autonomous Navigation
+## Introduction
-## Tsukuba Challenge 2019 Course
+Autonomous robot navigation in real-world, unaltered outdoor environments is a highly complex problem, not only because of the technological challenges of safe robot navigation, but also because of the preparation and management involved.
-### fuRo
+Having gone through the arduous procedure of operating a robot outdoors several times, we know it is hardly a single-person job. A team has to handle many tasks: preparing equipment in advance (at least charging several sets of batteries), transporting everything to the designated operation grounds, assembling the robot and the operations base, continuously monitoring the robot and the human operators for overheating, reading displays that are barely visible under strong sunlight, coping with changing weather conditions, and so on. We know what is involved, yet we have to face it every time we need data to test our algorithms or validate our research.
-#### Sensor Data
+In this repository we address this problem by collecting datasets captured by multiple teams during their participation in **Tsukuba Challenge** -- a **real world robot challenge** held every year since 2007 in Tsukuba City, Japan. Many teams attend such robotics challenges and demonstrations every year. The accumulated perception data, captured in real-world conditions, in environments not modified to suit the robot's behaviour, and under diverse weather and traffic conditions, is of immense value, and sharing it is how we can improve the state of the art in autonomous outdoor robot navigation.
-- **Short Name:** tc19_furo
-- **File:** [tc19_js_2019-09-14-14-12-15.bag](https://)
-- **Size:** 55.8 GB
-- **Format:** rosbag
-- **Date:** 2019-09-14 14:12:15
-- **Duration:** 1hr 47:31s
-- **Setup:** Mobile Robot (Joystick Operation)
-- **Sensors:**
-  - **Lidar:** SureStar R-Fans-16M
-  - **Camera:** No
-  - **Radar:** No
-  - **GNSS:** No
-  - **IMU:** Xsens MTi-3
-  - **Wheel Encoders (Odometry):** Yes
-- **Description:** Low cost 3D-Lidar, Less accurate wheel odometry.
-- **License:** TBD
+We welcome your contributions to this ambitious repository of real-world autonomous robot navigation datasets.
-If you use our dataset in your academic work, please cite the following paper [[DOI](https://doi.org/10.1080/01691864.2020.1717375)]:
```
-Yoshitaka Hara and Masahiro Tomono: "Moving Object Removal and Surface Mesh Mapping for Path Planning on 3D Terrain", Advanced Robotics, vol. 34, no. 6, pp. 375--387, 2020.
```
+## How to contribute
+Because this initiative comes from an open, non-profit community of roboticists, this GitHub organization has limited resources and cannot host the actual dataset files. Instead, we kindly ask you to keep your contributed dataset in your own cloud storage and to share the links to those files here.
+### Procedure
+We will keep all the contributed datasets under the `datasets` folder.
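+For orientation, after your contribution the repository layout might look roughly like this (the team folder name below is purely illustrative):
+```
+datasets/
+  template/
+    README.md          # template to copy and fill in
+    sample-README.md   # filled-in example
+  my-team-tc2019/      # your dataset folder, one folder per dataset
+    README.md          # description, download links, license, contacts
+```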
-#### Map Data
+You can clone this repository, but the preferred approach is to fork it into your own GitHub account, create a branch to hold your changes, and then create a folder inside `datasets` with a descriptive name (if you are contributing multiple datasets, please create additional sub-folders). Inside your new folder, please add a `README.md` file describing the dataset, including the links to the shared resources, the names of the contact persons, the license type, etc. A template with the basic contents of this `README.md` file is provided in the `datasets/template` folder. Finally, commit your changes and issue a pull request so we can include your contribution. We will also keep an index of the contributed datasets in this repository for the convenience of all users.
-- **Short Name:** map_tc19_furo
-- **File:** [map_tc19_o085_f-04_t05.pcd](https://)
-- **Size:** 683 MB
-- **Format:** pcd
-- **Number of Points:** 22,356,688
-- **Point Type:**
-  - **XYZ:** Yes
-  - **Intensity:** Yes
-  - **Color:** No
-  - **Normal:** Yes
-- **SLAM Method:** Occupancy Voxel Mapping using 3D Cartographer
-- **Description:** Moving objects has been removed.
-- **License:** TBD
+### Example
+Follow these steps when contributing a new dataset:
-If you use our dataset in your academic work, please cite the following paper [[DOI](https://doi.org/10.1080/01691864.2020.1717375)]:
+1\. Clone this repository (if you forked the repo, please see the instructions below):
```
-Yoshitaka Hara and Masahiro Tomono: "Moving Object Removal and Surface Mesh Mapping for Path Planning on 3D Terrain", Advanced Robotics, vol. 34, no. 6, pp. 375--387, 2020.
+$ git clone https://github.com/tsukubachallenge/datasets.git
```
+Follow these steps if you are using a fork; if not, skip to step 2.
+First, let's create the fork by clicking the **Fork** button:
-## Tsukuba Challenge 2018 Course
+![Fork](docs/fork.png)
-### [WIP] fuRo
+It will take just a few seconds.
-## Other Courses
+Press the **Clone** or **Code** button to copy the fork URL:
-### [WIP] fuRo Tsudanuma
+![Clone](docs/clone-from-fork.png)
+Then clone from your own fork:
+```
+$ git clone https://github.com/<your-account>/datasets.git
+```
+Then add the corresponding upstream remote repository:
+```
+$ git remote add upstream https://github.com/tsukubachallenge/datasets.git
+```
+And let's verify your remote repository locations:
+```
+$ git remote -v
+origin    https://github.com/<your-account>/datasets.git (fetch)
+origin    https://github.com/<your-account>/datasets.git (push)
+upstream  https://github.com/tsukubachallenge/datasets.git (fetch)
+upstream  https://github.com/tsukubachallenge/datasets.git (push)
+```
+Now that the upstream repository is set, let's update your local `master` branch with the upstream repository's `master` branch:
+```
+$ git checkout master
+$ git fetch upstream
+$ git merge upstream/master
+```
+![Example](docs/example1.png)
-## Example Course Template
+2\. Let's create your branch:
+```
+$ cd datasets
+$ git checkout -b <branch-name>
+```
+The branch name can be, for example, the name of your dataset.
-### Team (and Course) Name in English
+3\. Let's add your dataset folder:
+```
+$ mkdir datasets/<your-dataset-name>
+```
+4\. Copy the README.md template from the template sub-folder and fill it in:
+```
+$ cp datasets/template/README.md datasets/<your-dataset-name>/
+$ cd datasets/<your-dataset-name>
+$ <your-editor> README.md
+```
-#### Sensor Data (optional)
+5\. Add and commit your changes:
+```
+$ git add .
+$ git commit -m "<commit message>"
+```
+Remember that `git commit` only records your changes locally; nothing is sent to GitHub until you push.
+6\. If you are ready to push your branch for the first time:
```
+$ git push --set-upstream origin <branch-name>
```
+This will push your changes to your branch on GitHub. If you forked, the branch is pushed to your own fork.
+
+If you have already defined the upstream, then simply do:
+```
+$ git push
+```
+You can commit and push as many times as you need.
-#### Map Data (optional)
+7\. Once you are finished, please issue a pull request so we can merge your changes.
+* Go to the datasets GitHub repository at [https://github.com/tsukubachallenge/datasets](https://github.com/tsukubachallenge/datasets)
+* Check the branches pull-down menu and select your branch
+* GitHub will inform you that there are changes
+* Select the Pull Request option
-# [WIP] Related Datasets
+## Template for your dataset
+You can find a template for your dataset README.md file, together with one sample file, inside the `datasets/template` folder.
diff --git a/datasets/fuRo/README.md b/datasets/fuRo/README.md
new file mode 100644
index 0000000..2c45cb1
--- /dev/null
+++ b/datasets/fuRo/README.md
@@ -0,0 +1,78 @@
+# Tsukuba Challenge Datasets
+
+**Real World Datasets for Autonomous Navigation**
+
+
+## Tsukuba Challenge 2019 Course
+
+### fuRo
+
+#### Sensor Data
+
+- **Short Name:** tc19_furo
+- **File:** [tc19_js_2019-09-14-14-12-15.bag](https://)
+- **Size:** 55.8 GB
+- **Format:** rosbag
+- **Date:** 2019-09-14 14:12:15
+- **Duration:** 1 hr 47 min 31 s
+- **Setup:** Mobile Robot (Joystick Operation)
+- **Sensors:**
+  - **Lidar:** SureStar R-Fans-16M
+  - **Camera:** No
+  - **Radar:** No
+  - **GNSS:** No
+  - **IMU:** Xsens MTi-3
+  - **Wheel Encoders (Odometry):** Yes
+- **Description:** Low-cost 3D LiDAR; less accurate wheel odometry.
+- **License:** TBD
+
+If you use our dataset in your academic work, please cite the following paper [[DOI](https://doi.org/10.1080/01691864.2020.1717375)]:
+```
+Yoshitaka Hara and Masahiro Tomono: "Moving Object Removal and Surface Mesh Mapping for Path Planning on 3D Terrain", Advanced Robotics, vol. 34, no. 6, pp. 375--387, 2020.
+```
+
+
+#### Map Data
+
+- **Short Name:** map_tc19_furo
+- **File:** [map_tc19_o085_f-04_t05.pcd](https://)
+- **Size:** 683 MB
+- **Format:** pcd
+- **Number of Points:** 22,356,688
+- **Point Type:**
+  - **XYZ:** Yes
+  - **Intensity:** Yes
+  - **Color:** No
+  - **Normal:** Yes
+- **SLAM Method:** Occupancy Voxel Mapping using 3D Cartographer
+- **Description:** Moving objects have been removed.
+- **License:** TBD
+
+If you use our dataset in your academic work, please cite the following paper [[DOI](https://doi.org/10.1080/01691864.2020.1717375)]:
+```
+Yoshitaka Hara and Masahiro Tomono: "Moving Object Removal and Surface Mesh Mapping for Path Planning on 3D Terrain", Advanced Robotics, vol. 34, no. 6, pp. 375--387, 2020.
+```
+
+
+## Tsukuba Challenge 2018 Course
+
+### [WIP] fuRo
+
+
+## Other Courses
+
+### [WIP] fuRo Tsudanuma
+
+
+## Example Course Template
+
+### Team (and Course) Name in English
+
+
+#### Sensor Data (optional)
+
+
+#### Map Data (optional)
+
+
+# [WIP] Related Datasets
diff --git a/datasets/template/README.md b/datasets/template/README.md
new file mode 100644
index 0000000..23a30b3
--- /dev/null
+++ b/datasets/template/README.md
@@ -0,0 +1,80 @@
+# `<DATASET NAME>`
+
+Give a descriptive title to your dataset.
+
+
+## Authors
+Please provide the list of authors and contact information.
+
+## Description
+Please provide a general description of your dataset; you may add images and other resources to make your README.md file more descriptive.
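+
+For instance, a filled-in description might read like this (the wording and image path are only illustrative):
+```
+## Description
+Outdoor navigation data recorded along the course around the Tsukuba City Hall,
+covering roughly 2 km under cloudy weather with moderate pedestrian traffic.
+
+![Robot and course overview](images/course-overview.jpg)
+```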
+
+## Copyright/License
+PLEASE INDICATE WHO OWNS THE COPYRIGHT AND/OR WHAT THE LICENSE MODEL OF YOUR DATASET IS. SOME STANDARD LICENSES ARE: **GPL 3, BSD, MIT, Apache 2.0, CC BY-NC-SA 4.0**, etc.
+
+## Citation
+PLEASE PROVIDE THE BIBTEX OR PLAIN CITATION OF ANY OF YOUR PAPERS RELATED TO THIS DATASET.
+
+### Sensor Data
+Please describe the sensor data in your dataset, as follows:
+
+- **Short Name:** `<SHORT NAME OF THE DATASET>`
+- **Description:** `<GENERAL DESCRIPTION>`
+- **Location:** `<WHERE THE DATA WAS CAPTURED>`
+- **File:** `<LINK TO THE SHARED FILE>`
+- **Size:** `<FILE SIZE>`
+- **Format:** `<FILE FORMAT>`
+- **Date:** `<CAPTURE DATE AND TIME>`
+- **Duration:** `<DURATION OF THE RECORDING>`
+- **Sensors:** `<LIST OF SENSORS>`
+  - **3D LiDAR:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **2D LRF:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **RGB Camera:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **IR Camera:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **RGB-D/Flash Camera:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **Event/DVS Camera:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **Radar:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **IMU:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **GNSS:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **Wheel Encoders (Odometry):** `<MESSAGE TYPE, OR No>`
+  - **Ultrasonic:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **Microphone:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **Temperature:** `<MODEL AND MESSAGE TYPES, OR No>`
+  - **Others:** `<ANY OTHER SENSORS>`
+
+FEEL FREE TO ADD MORE SENSOR TYPES. IF YOU HAVE MORE THAN ONE SENSOR OF THE SAME TYPE, PLEASE LIST THEM ALL.
+
+
+### Coordinate System
+- **Units:** Please state the distance units (meters, centimeters, millimeters, inches) and the rotation units (degrees, radians).
+- **Robot:** Please provide the robot dimensions [width, length, height].
+- **Baselink/center:** Please provide the coordinates of the baselink [X, Y, Z] relative to the robot body.
+- **Sensors:** For each sensor, please describe the relative pose [X, Y, Z, Yaw, Pitch, Roll] with respect to the robot center or baselink.
+
+You can also contribute the extrinsics via calibration files. The intrinsic calibration files of the cameras are also appreciated.
+
+### Map Data
+If you are also contributing the map(s), please describe them as follows:
+
+- **Short Name:** `<SHORT NAME OF THE MAP>`
+- **Description:** `<GENERAL DESCRIPTION>`
+- **File:** `<LINK TO THE SHARED FILE>`
+- **Size:** `<FILE SIZE>`
+- **Format:** `<FILE FORMAT>`
+- **Date:** `<MAPPING DATE>`
+- **Number of Points:** `<NUMBER OF POINTS>`
+- **Point Type:** `<POINT FIELDS>`
+  - **XY:** `<2D COORDINATES INCLUDED>`
+  - **XYZ:** `<3D COORDINATES INCLUDED>`
+  - **Intensity:** `<INTENSITY INCLUDED>`
+  - **RGB:** `<COLOR INCLUDED>`
+  - **Normal:** `<NORMALS INCLUDED>`
+  - **Other:** `<OTHER FIELDS>`
+- **SLAM Method:** `<SLAM METHOD USED TO BUILD THE MAP>`
+- **GNSS:** `<WHETHER GNSS WAS USED>`
+- **OTHER INFO:** `<ANY OTHER INFORMATION>`
+
+In case the map is split into several submaps, please provide a single tar/tar.gz/tar.bz2/zip/7z/etc. archive instead.
+
+### Annotation
+If your dataset has annotations (labels) corresponding to ground truth, please provide a description of the labels.
diff --git a/datasets/template/sample-README.md b/datasets/template/sample-README.md
new file mode 100644
index 0000000..d641ef2
--- /dev/null
+++ b/datasets/template/sample-README.md
@@ -0,0 +1,99 @@
+# TC2018-ABC-TEAM
+
+## Authors
+Dennis Ritchie
+
+Linus Torvalds
+
+Guido van Rossum
+
+Eric Berger
+
+## Description
+This is our dataset captured during Tsukuba Challenge 2018. It consists of a 2 km long trajectory with several pedestrians and different illumination conditions.
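+
+The data is shared as a single rosbag (see the Sensor Data section below). A quick way to inspect it after downloading is sketched here; the odometry topic name is only an example, check the `rosbag info` output for the actual topics:
+```
+$ rosbag info tc2018-abc-team.bag
+$ rostopic echo -b tc2018-abc-team.bag -p /odom | head
+```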
+
+## Copyright/License
+Copyright 2015-2019 ABC TEAM, licensed under Apache 2.0.
+
+## Citation
+```
+@INPROCEEDINGS { linux-python-ros,
+    author = {Ritchie, Dennis and Torvalds, Linus and van Rossum, Guido and Berger, Eric},
+    title = { We are the authors of {C}, {Linux}, {Python} and {ROS} },
+    booktitle = { Proceedings of Some Important Conference },
+    year = { 2020 },
+    month = { August }
+}
+```
+
+### Sensor Data
+Please describe the sensor data in your dataset, as follows:
+
+- **Short Name:** TC2018-ABC-TEAM
+- **Description:** This file contains the data we captured during Tsukuba Challenge 2018
+- **Location:** Around Tsukuba City Hall
+- **File:** [tc2018-abc-team.bag](https://)
+- **Size:** 20 GB
+- **Format:** ROSBAG
+- **Date:** 2018/11/11 10:30am
+- **Duration:** 1h22min
+- **Sensors:**
+  - **3D LiDAR:** VELODYNE VLP16, sensor_msgs/PointCloud2, velodyne_msgs/VelodyneScan
+  - **3D LiDAR:** VELODYNE VLP32, sensor_msgs/PointCloud2, velodyne_msgs/VelodyneScan
+  - **2D LRF:** Hokuyo UST-20LX, sensor_msgs/LaserScan
+  - **RGB Camera:** FLIR Grasshopper, sensor_msgs/Image
+  - **IR Camera:** FLIR ADK, sensor_msgs/Image
+  - **RGB-D/Flash Camera:** Microsoft Kinect2, sensor_msgs/PointCloud2, sensor_msgs/Image
+  - **Event/DVS Camera:** No
+  - **Radar:** No
+  - **IMU:** XSens MTi-300, sensor_msgs/Imu
+  - **GNSS:** No
+  - **Wheel Encoders (Odometry):** nav_msgs/Odometry
+  - **Ultrasonic:** No
+  - **Microphone:** No
+  - **Temperature:** No
+  - **Others:** Joystick commands, sensor_msgs/Joy
+
+
+### Coordinate System
+- **Units:** All distances are expressed in meters; all rotations in degrees.
+- **Robot:** Robot dimensions [width, length, height] = [0.6, 0.4, 0.6]
+- **Baselink/center:** Baselink at [0.3, 0.2, 0.15]
+- **Sensors:**
+  * VLP16 to baselink [X, Y, Z, Yaw, Pitch, Roll] = [0, 0, 0.5, 0, 0, 0]
+  * VLP32 to baselink [X, Y, Z, Yaw, Pitch, Roll] = [0.3, 0, 0.3, 0, 0, 0]
+  * UST-20LX to baselink [X, Y, Z, Yaw, Pitch, Roll] = [0.3, 0, 0, 0, 0, 0]
+  * Grasshopper to baselink [X, Y, Z, Yaw, Pitch, Roll] = [0, 0, 0.6, 0, 0, 0]
+  * ADK to baselink [X, Y, Z, Yaw, Pitch, Roll] = [0, 0, 0.4, 0, 0, 0]
+  * Kinect2 to baselink [X, Y, Z, Yaw, Pitch, Roll] = [0, 0, 0.5, 0, -10, 0]
+  * MTi-300 to baselink [X, Y, Z, Yaw, Pitch, Roll] = [0, 0, 0, 0, 0, 0]
+
+The Grasshopper calibration file is at [grasshopper.yaml](calib/grasshopper.yaml)
+
+The VLP16-to-Grasshopper extrinsic calibration file is at [vlp16-grasshopper.yaml](calib/vlp16-grasshopper.yaml)
+
+The VLP32-to-Grasshopper extrinsic calibration file is at [vlp32-grasshopper.yaml](calib/vlp32-grasshopper.yaml)
+
+
+### Map Data
+
+- **Short Name:** map_tc2018-abc-team
+- **Description:** Map of the TC2018 course; includes moving objects
+- **File:** [map_tc2018-abc-team.pcd](https://)
+- **Size:** 500 MB
+- **Format:** pcd
+- **Date:** 2018/08/05 15:30
+- **Number of Points:** 30,000,000
+- **Point Type:**
+  - **XYZ:** Yes
+  - **Intensity:** Yes
+  - **RGB:** No
+  - **Normal:** No
+  - **Other:** No
+- **SLAM Method:** NDT 3D
+- **GNSS:** No
+- **OTHER INFO:** No
+
+
+### Annotation
+No annotation.
diff --git a/docs/clone-from-fork.png b/docs/clone-from-fork.png
new file mode 100644
index 0000000..6f46474
Binary files /dev/null and b/docs/clone-from-fork.png differ
diff --git a/docs/example1.png b/docs/example1.png
new file mode 100644
index 0000000..f335a42
Binary files /dev/null and b/docs/example1.png differ
diff --git a/docs/fork.png b/docs/fork.png
new file mode 100644
index 0000000..d2795fc
Binary files /dev/null and b/docs/fork.png differ