Main README.md for this repo #2

Open · wants to merge 1 commit into master
133 changes: 84 additions & 49 deletions README.md
# Tsukuba Challenge Datasets

## Real World Datasets for Autonomous Navigation

## Introduction

Autonomous robot navigation in real-world, unaltered outdoor environments is a highly complex problem, not only because of the technological challenges involved in safe robot navigation, but also because of the preparation and management it requires.

Having experienced the arduous procedure of operating a robot outdoors several times, we know it is hardly a single-person job. A team has to handle many tasks: preparing equipment in advance (at least charging several sets of batteries), transporting everything to the designated operation grounds, assembling the robot and the operations base, continuously monitoring for overheating of both robot and human operators, coping with displays that are barely readable in strong sunlight, dealing with changing weather conditions, and so on. We know what is involved, and we face it every time we need data to test our algorithms or validate our research.

In this repository we attempt to address this problem by collecting datasets captured by multiple teams during their participation in the **Tsukuba Challenge**, a **real world robot challenge** held every year since 2007 in Tsukuba City, Japan. Many teams attend such robotics challenges and demonstrations each year. The accumulated set of perception experiences, captured in real-world conditions, in environments not modified to suit robot behaviour, under diverse weather and traffic conditions, is of immense value, and sharing it is how we can improve the state of the art in autonomous outdoor robot navigation.

We welcome your contributions to this ambitious repository of real-world autonomous robotic navigation datasets.

## How to contribute
Because this initiative comes from an open community of roboticists with no commercial interest, this GitHub organization has limited resources and cannot host the actual dataset files. Instead, we kindly ask you to keep your contributed dataset in your own cloud storage and to share the links to those files.
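
For instance, a dataset file hosted on your own cloud storage would be linked in your `README.md` like this (the URL is a placeholder, not a real location):
```
- **File:** [tc19_example.bag](https://your-cloud-storage.example.com/tc19_example.bag)
```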

### Procedure
We will keep all the contributed datasets under the `datasets` folder.
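For example, with the datasets currently in this repository, the layout looks like this:
```
datasets/
├── fuRo/
│   └── README.md
└── template/
    └── README.md
```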

You can clone this repository, but the preferred approach is to fork it into your own GitHub account, create a branch to hold your changes, and then create a folder inside `datasets` with a descriptive name (if you are contributing multiple datasets, please create additional sub-folders). Inside your new folder, add a `README.md` file describing the dataset, including the links to the shared resources, the names of the contact persons, the license type, etc. A template with the basic contents of this `README.md` file is provided in the `datasets/template` folder. Finally, commit your changes and open a pull request so we can include your contribution. We will also keep an index of the contributed datasets in this repository for the convenience of all users.

### Example
Follow these steps when contributing a new dataset:

1\. Clone this repository (if you forked the repo, please see the instructions below):
```
$ git clone https://github.com/tsukubachallenge/datasets.git
```
Follow these steps if you are using a fork; if not, skip to step 2.

First, let's create the fork by clicking the **Fork** button:

![Fork](docs/fork.png)


Creating the fork takes just a few seconds.

Press the **Clone** or **Code** button to copy the fork URL:

![Clone](docs/clone-from-fork.png)

Then clone from your own fork:
```
$ git clone https://github.com/<YOUR ACCOUNT>/datasets.git
```
Then, add the corresponding remote upstream repository:
```
$ git remote add upstream https://github.com/tsukubachallenge/datasets.git
```
Now verify your remote repository locations:
```
$ git remote -v
origin https://github.com/<YOUR ACCOUNT>/datasets.git (fetch)
origin https://github.com/<YOUR ACCOUNT>/datasets.git (push)
upstream https://github.com/tsukubachallenge/datasets.git (fetch)
upstream https://github.com/tsukubachallenge/datasets.git (push)
```
Now that the upstream remote is set, update your local master branch from the upstream repository:
```
$ git checkout master
$ git fetch upstream
$ git merge upstream/master
```
![Example](docs/example1.png)

2\. Let's create your branch:
```
$ cd datasets
$ git checkout -b <YOUR-BRANCH-NAME>
```
The branch name can be, for example, the name of your dataset.
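For instance, using the short name of the fuRo dataset in this repository:
```
$ git checkout -b tc19_furo
```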

3\. Let's add your dataset folder:
```
$ mkdir datasets/<YOUR-DATASET-NAME>
```
4\. Copy the `README.md` template from the template sub-folder and fill it in:

```
$ cp datasets/template/README.md datasets/<YOUR-DATASET-NAME>
$ cd datasets/<YOUR-DATASET-NAME>
$ <YOUR-FAVOURITE-TEXT-EDITOR> README.md
```

5\. Add and commit your changes:
```
$ git add .
$ git commit -m "<A SHORT DESCRIPTION OF YOUR CHANGES>"
```
Remember that `git commit` only saves your changes locally; nothing reaches the GitHub server until you push.

6\. If you are ready to push your branch for the first time:
```
$ git push --set-upstream origin <YOUR-BRANCH-NAME>
```
This will push your changes into your branch in GitHub. If you forked, the branch is pushed into your own fork.

If you have already defined the upstream, then simply do:
```
$ git push
```
You can commit and push as many times as you need.

7\. Once you are finished, please open a pull request so we can merge your changes.

* Go to the datasets GitHub repository at [https://github.com/tsukubachallenge/datasets](https://github.com/tsukubachallenge/datasets)
* Check the branches pulldown menu and select your branch
* GitHub will inform you there are changes
* Select the Pull Request option
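
Alternatively, if you have the GitHub CLI (`gh`) installed, you can open the pull request from the command line. This is optional, and the title and body below are placeholders:
```
$ gh pr create --base master --title "Add <YOUR-DATASET-NAME>" --body "New dataset contribution"
```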

## Template for your dataset
You can find a template for your dataset README.md file together with one sample file inside the `datasets/template` folder.
78 changes: 78 additions & 0 deletions datasets/fuRo/README.md
# Tsukuba Challenge Datasets

**Real World Datasets for Autonomous Navigation**


## Tsukuba Challenge 2019 Course

### fuRo

#### Sensor Data

- **Short Name:** tc19_furo
- **File:** [tc19_js_2019-09-14-14-12-15.bag](https://)
- **Size:** 55.8 GB
- **Format:** rosbag
- **Date:** 2019-09-14 14:12:15
- **Duration:** 1:47:31 (hh:mm:ss)
- **Setup:** Mobile Robot (Joystick Operation)
- **Sensors:**
- **Lidar:** SureStar R-Fans-16M
- **Camera:** No
- **Radar:** No
- **GNSS:** No
- **IMU:** Xsens MTi-3
- **Wheel Encoders (Odometry):** Yes
- **Description:** Low-cost 3D lidar; less accurate wheel odometry.
- **License:** TBD
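
A quick way to inspect the bag's topics, types, and duration before use, assuming a ROS 1 installation with the standard `rosbag` tool:
```
$ rosbag info tc19_js_2019-09-14-14-12-15.bag
```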

If you use our dataset in your academic work, please cite the following paper [[DOI](https://doi.org/10.1080/01691864.2020.1717375)]:
```
Yoshitaka Hara and Masahiro Tomono: "Moving Object Removal and Surface Mesh Mapping for Path Planning on 3D Terrain", Advanced Robotics, vol. 34, no. 6, pp. 375--387, 2020.
```


#### Map Data

- **Short Name:** map_tc19_furo
- **File:** [map_tc19_o085_f-04_t05.pcd](https://)
- **Size:** 683 MB
- **Format:** pcd
- **Number of Points:** 22,356,688
- **Point Type:**
- **XYZ:** Yes
- **Intensity:** Yes
- **Color:** No
- **Normal:** Yes
- **SLAM Method:** Occupancy Voxel Mapping using 3D Cartographer
- **Description:** Moving objects have been removed.
- **License:** TBD
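
To take a first look at the map, one option is the PCL point-cloud viewer, assuming the PCL command-line tools are installed (e.g. the `pcl-tools` package on Ubuntu):
```
$ pcl_viewer map_tc19_o085_f-04_t05.pcd
```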

If you use our dataset in your academic work, please cite the following paper [[DOI](https://doi.org/10.1080/01691864.2020.1717375)]:
```
Yoshitaka Hara and Masahiro Tomono: "Moving Object Removal and Surface Mesh Mapping for Path Planning on 3D Terrain", Advanced Robotics, vol. 34, no. 6, pp. 375--387, 2020.
```


## Tsukuba Challenge 2018 Course

### [WIP] fuRo


## Other Courses

### [WIP] fuRo Tsudanuma


## Example Course Template

### Team (and Course) Name in English


#### Sensor Data (optional)


#### Map Data (optional)


# [WIP] Related Datasets
80 changes: 80 additions & 0 deletions datasets/template/README.md
# `<YOUR DATASET NAME>`

Give a meaningful title to your dataset.


## Authors
Please provide the list of authors and contact information.

## Description
Please provide a general description of your dataset. You may add images and other resources to make your `README.md` file more descriptive.

## Copyright/License
PLEASE INDICATE WHO OWNS THE COPYRIGHT AND/OR THE LICENSE MODEL OF YOUR DATASET. SOME STANDARD LICENSES ARE: **GPL 3, BSD, MIT, Apache 2.0, CC BY-NC-SA 4.0**, etc.

## Citation
PLEASE PROVIDE THE BIBTEX OR PLAIN CITATION OF ANY OF YOUR PAPERS RELATED TO THIS DATASET.
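
As an illustration, the paper cited by the fuRo dataset elsewhere in this repository could be given in BibTeX as follows (fields taken from its plain citation and DOI; the entry key is arbitrary):
```
@article{hara2020moving,
  author  = {Yoshitaka Hara and Masahiro Tomono},
  title   = {Moving Object Removal and Surface Mesh Mapping for Path Planning on 3D Terrain},
  journal = {Advanced Robotics},
  volume  = {34},
  number  = {6},
  pages   = {375--387},
  year    = {2020},
  doi     = {10.1080/01691864.2020.1717375}
}
```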

### Sensor Data
Please describe the sensor data in your dataset, as follows:

- **Short Name:** `<SIMPLE NAME>`
- **Description:** `<BRIEF DESCRIPTION OF THE FILE(S) AND HOW DATA WAS CAPTURED>`
- **Location:** `<WHERE WAS THIS DATA CAPTURED>`
- **File:** `<LINK TO THE FILE(S) IN MARKDOWN FORMAT [LINK NAME](URL)>`
- **Size:** `<SIZE OF YOUR DATASET>`
- **Format:** `<DISTRIBUTION FORMAT: ROSBAG/KITTI/CSV/ETC>`
- **Date:** `<DATE/TIME WHEN YOU CREATED THIS DATASET>`
- **Duration:** `<TOTAL TIME>`
- **Sensors:** `<PLEASE DESCRIBE THE SENSORS IN THIS DATASET, HERE ARE A FEW COMMON SENSORS>`
- **3D LiDAR:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **2D LRF:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **RGB Camera:** `<MODEL NAME, LENS TYPE AND DATATYPE, NO IF NOT AVAILABLE>`
- **IR Camera:** `<IR/THERMO CAMERA MODEL AND DATATYPE, NO IF NOT AVAILABLE>`
- **RGB-D/Flash Camera:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **Event/DVS Camera:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **Radar:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **IMU:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **GNSS:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **Wheel Encoders (Odometry):** `<DESCRIPTION AND DATATYPE, NO IF NOT AVAILABLE>`
- **Ultrasonic:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **Microphone:** `<DESCRIPTION AND DATATYPE, NO IF NOT AVAILABLE>`
- **Temperature:** `<MODEL NAME AND DATATYPE, NO IF NOT AVAILABLE>`
- **Others:** `<DESCRIPTION AND DATATYPE, NO IF NOT AVAILABLE>`

FEEL FREE TO ADD MORE SENSOR TYPES. IF YOU HAVE MORE THAN ONE SENSOR OF THE SAME TYPE, PLEASE LIST THEM ALL.


### Coordinate System
- **Units:** Please state the distance units (meters, centimeters, millimeters, inches) and the rotation units (degrees, radians).
- **Robot:** Please provide the robot dimensions [width, length, height].
- **Baselink/center:** Please provide the coordinates of the baselink [X, Y, Z] relative to the robot body.
- **Sensors:** Please describe the pose [X, Y, Z, Yaw, Pitch, Roll] of each sensor with respect to the robot center or baselink.

You can also contribute the extrinsics via calibration files. Intrinsic calibration files for the cameras are also appreciated.
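
A minimal sketch of a filled-in coordinate-system section, with illustrative values only (meters and radians; not from any actual robot):
```
- **Units:** meters, radians
- **Robot:** [0.60, 0.90, 1.10]
- **Baselink/center:** [0.00, 0.00, 0.30]
- **Sensors:**
  - **3D LiDAR:** [0.20, 0.00, 0.80, 0.0, 0.0, 0.0]
  - **IMU:** [0.00, 0.00, 0.25, 0.0, 0.0, 0.0]
```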

### Map Data
If you are contributing also the map(s), please describe it as follows:

- **Short Name:** `<SIMPLE NAME>`
- **Description:** `<BRIEF DESCRIPTION OF THE MAP, WHAT IT INCLUDES, ISSUES, ETC.>`
- **File:** `<LINK TO THE MAP FILE(S) IN MARKDOWN FORMAT [LINK NAME](URL)>`
- **Size:** `<SIZE OF THE MAP>`
- **Format:** `<FORMAT OF THE MAP: ASCII PCD/BINARY PCD/PLY/VTK/OBJ/3DS/XML/PNG/PGM/YAML>`
- **Date:** `<DATE/TIME WHEN YOU CREATED THIS MAP>`
- **Number of Points:** `<NUMBER OF POINTS IN THE MAP>`
- **Point Type:** `<PLEASE DESCRIBE THE MAP POINTS>`
- **XY:** `<ONLY 2D COORDINATES INCLUDED>`
- **XYZ:** `<3D COORDINATES INCLUDED>`
- **Intensity:** `<LIDAR INTENSITY INFORMATION INCLUDED>`
- **RGB:** `<CAMERA COLOR INFORMATION INCLUDED>`
- **Normal:** `<NORMAL VECTOR INCLUDED>`
- **Other:** `<PLEASE DESCRIBE FIELD TYPES>`
- **SLAM Method:** `<WHICH SLAM METHOD WAS USED TO CREATE THE MAP: NDT2D/NDT3D/Cartographer/GMapping/Octomap/LOAM/OpenVSLAM/Litamin/SuMa++/ETC.>`
- **GNSS:** `<WHETHER COORDINATES USE GNSS OR NOT>`
- **OTHER INFO:** `<PLEASE PROVIDE OTHER INFORMATION NECESSARY>`

If the map is split into several submaps, please provide a single archive (tar/tar.gz/tar.bz2/zip/7z/etc.) instead.
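
For example, to bundle a set of submaps into a single compressed archive (folder and file names are illustrative):
```
$ tar czf map_tc19_submaps.tar.gz submaps/
```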

### Annotation
If your dataset has annotations (labels) corresponding to ground truth, please provide a description of the labels.