Active3DGym is a set of benchmark environments for the active view planning problem in robotics.
To install:

```shell
git clone https://github.com/kevin-thankyou-lin/active-3d-gym
cd active-3d-gym
pip install -e .
```
Then, to use the provided environments in your own code:

```python
import gym
import active_3d

env = gym.make("OfflineActive3D-v0", data_dir=<path/to/data_dir>)
```
We assume `data_dir`'s folder structure is as follows:
```
data_dir/
- transforms_train.json
- transforms_val.json
- object_bounds.json  # optional: object bound information
- images/
    im_0.png
    im_0_depth.exr
    im_0_distance.exr
    im_1.png
    im_1_depth.exr
    im_1_distance.exr
    ...
```
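As a sanity check, a small helper can verify that a directory matches the layout above. This is a sketch, not part of the package: `check_data_dir` is a hypothetical helper name, and the file names follow the listing above.

```python
from pathlib import Path

def check_data_dir(data_dir):
    """Verify that data_dir matches the expected offline-gym layout."""
    data_dir = Path(data_dir)
    problems = []
    # transforms_train.json is required; transforms_val.json is optional.
    if not (data_dir / "transforms_train.json").is_file():
        problems.append("missing transforms_train.json")
    images = data_dir / "images"
    if not images.is_dir():
        problems.append("missing images/ directory")
    else:
        # Every RGB image should have matching depth and distance EXR maps.
        for rgb in sorted(images.glob("im_*.png")):
            for suffix in ("_depth.exr", "_distance.exr"):
                if not (images / f"{rgb.stem}{suffix}").is_file():
                    problems.append(f"missing {rgb.stem}{suffix}")
    return problems  # an empty list means the layout looks OK
```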
`transforms_train.json` (and, optionally, `transforms_val.json`) should have a file structure compatible with NeRF data, such as:
```json
{
    "fl_x": ...,
    "fl_y": ...,
    "c_x": ...,
    "c_y": ...,
    "aabb_scale": null,
    "frames": [
        {
            "file_path": "train/im_0.png",
            "depth_path": "train/im_0_depth.exr",
            "distance_path": "train/im_0_distance.exr",
            "transform_matrix": [...]
        },
        ...
    ]
}
```
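For reference, the per-frame intrinsics above map to a standard 3x3 pinhole camera matrix. A minimal sketch, assuming `fl_x`/`fl_y` are focal lengths in pixels and `c_x`/`c_y` the principal point (the helper name is hypothetical):

```python
def intrinsics_matrix(transforms):
    """Build a 3x3 pinhole intrinsics matrix K from a NeRF-style transforms dict."""
    fx, fy = transforms["fl_x"], transforms["fl_y"]
    cx, cy = transforms["c_x"], transforms["c_y"]
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]
```

With a dict `t` loaded from `transforms_train.json` via `json.load`, `intrinsics_matrix(t)` gives the K matrix needed to back-project the depth maps.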
To generate your own offline dataset, follow the instructions under Generating offline data below.
If a pixel has an invalid depth value, or if the depth value is infinite, the corresponding depth map (and distance map) value should be set to 0 by convention.
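A minimal sketch of that convention, assuming a depth (or distance) map loaded as nested lists of floats (the helper name is hypothetical):

```python
import math

def sanitize_depth(depth_map):
    """Zero out invalid (NaN) and infinite values, per the convention above."""
    return [[0.0 if (math.isnan(d) or math.isinf(d)) else d for d in row]
            for row in depth_map]
```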
Helpful note from Blenderproc docs: "While distance and depth images sound similar, they are not the same: In distance images, each pixel contains the actual distance from the camera position to the corresponding point in the scene. In depth images, each pixel contains the distance between the camera and the plane parallel to the camera which the corresponding point lies on."
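Following that definition, a distance value can be converted to a depth value by projecting it onto the camera's viewing axis: the ray through pixel (u, v) of a pinhole camera is r = ((u - c_x)/fl_x, (v - c_y)/fl_y, 1), and depth = distance / ||r||. A sketch under those assumptions (the function name is hypothetical):

```python
import math

def distance_to_depth(dist, u, v, fl_x, fl_y, c_x, c_y):
    """Convert camera-to-point distance at pixel (u, v) to planar depth."""
    # Ray direction through the pixel, in camera coordinates (z = 1 plane).
    rx = (u - c_x) / fl_x
    ry = (v - c_y) / fl_y
    norm = math.sqrt(rx * rx + ry * ry + 1.0)
    # Depth is the z-component of the point at that distance along the ray.
    return dist / norm
```

At the principal point the ray coincides with the optical axis, so depth equals distance; away from it, depth is strictly smaller.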
## Generating offline data

- Install BlenderProc.
- Install dcargs via:

  ```shell
  blenderproc pip install git+https://github.com/brentyi/dcargs.git
  ```

- Download ShapeNet.
- You can now run `scripts/blenderproc_offline_data.py` to generate offline data for the gym environment:

  ```shell
  blenderproc run scripts/blenderproc_offline_data.py --view-planner.num-cam-positions <num cam positions> --shapenet-path <path/to/ShapenetCoreV2> --candidate-view-radius 0.8 --obj <object>
  ```
Example train / eval data generation script:
```shell
blenderproc run scripts/blenderproc_offline_data.py --shapenet-path ../ShapeNetCore.v2/ --view-planner.num-cam-positions 3 --save-data-type eval --convert-background-to-white --save-ray-info
```

- `--save-ray-info`: saves rays for ray distance supervision (cf. DS-NeRF)
- `--convert-background-to-white`: converts the (infinite-distance) background to RGB color (255, 255, 255)
NOTE: dcargs replaces underscores with dashes on the command line and supports nested args (e.g. `--view-planner.num-cam-positions`).
The script `blenderproc_offline_data.py` creates a directory `<offline gym data dir>/<save data type>/<shapenet object name>`. The folder structure within this directory follows that of NeRF, i.e.:

```
transforms_train.json
transforms_val.json
images/
    img_0.png
    img_1.png
    ...
```
These instructions have been tested extensively on Ubuntu 18.04.