DREAM-DCGAN-for-Baxter-unconditional-trajectories

22.12.2017 DREAM report: This module takes Baxter trajectories produced by quality diversity (QD) search and generates new trajectories based on them.

We are currently working on extending this code to produce novel trajectories conditioned on the landing point of the object.

Short Summary:

The format_data_unconditional.py script takes archive_3600_trj.dat and the motion folder as input and outputs a data folder in a format digestible by the DCGAN code.
The main.py script takes the name of the data folder, the training stage (i.e. train/test), plus optional settings such as the input/output height/width and the preferred number of training epochs. It produces batches of samples (64 in each) that can then be used to control Baxter in Gazebo (or, at some later point, a real Baxter robot).

Prerequisites:

Usage:

After downloading this repository, please add archive_3600_trj.dat and the unzipped motion folder to the DREAM-DCGAN-for-Baxter-unconditional-trajectories/data folder:

DREAM-DCGAN-for-Baxter-unconditional-trajectories
├── data
│   ├── archive_3600_trj.dat
│   └── motion
└── ...

Then run the following commands:

cd ./DREAM-DCGAN-for-Baxter-unconditional-trajectories
python format_data_unconditional.py
python main.py --dataset formated_unconditional_trajectories --train

These commands format the data (producing a dataset from the first 1000 trajectories in the original archive) and then train the DCGAN to generate new trajectories.
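The optional arguments mentioned in the summary (input/output height/width, number of epochs, and so on) come from the underlying DCGAN-tensorflow code. As a rough illustration only, and assuming this fork keeps the upstream flag names, a run with an explicit epoch count and input size might look like this (the values here are placeholders, not recommended settings):

python main.py --dataset formated_unconditional_trajectories --epoch 25 --input_height 175 --input_width 26 --train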

Once the model is sufficiently trained, run:

python main.py --dataset formated_unconditional_trajectories --test

This will produce 10 batches (64 samples in each) of the newly generated trajectories for Baxter.

The generated data:

The data generated by the code in this repository has the same 26 dimensions, in the same order, as the motion data produced by the QD search. The only difference is that we interpolate and compress the trajectories to a fixed length of 175 time steps.
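As a minimal sketch of this resampling step, assuming plain NumPy/SciPy (the actual preprocessing lives in format_data_unconditional.py and may differ in detail), a variable-length QD trajectory can be interpolated to the fixed 175-step length like this:

import numpy as np
from scipy.interpolate import interp1d

TARGET_LEN = 175  # fixed trajectory length used in this repository
N_DIMS = 26       # number of motion dimensions produced by the QD search

def resample_trajectory(trajectory, target_len=TARGET_LEN):
    # Interpolate a (T, 26) trajectory onto a fixed number of time steps.
    trajectory = np.asarray(trajectory, dtype=np.float64)
    t_original = np.linspace(0.0, 1.0, num=trajectory.shape[0])
    t_resampled = np.linspace(0.0, 1.0, num=target_len)
    # Each of the 26 dimensions is interpolated independently along the time axis.
    interpolator = interp1d(t_original, trajectory, axis=0, kind="linear")
    return interpolator(t_resampled)

# Example: compress a 300-step, 26-dimensional trajectory to 175 steps.
dummy_trajectory = np.random.rand(300, N_DIMS)
print(resample_trajectory(dummy_trajectory).shape)  # (175, 26)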

In addition, in both training and test modes the code produces 3D visualisations of the end-effector trajectories before the throw (in Cartesian coordinates). The images should look something like this:

It is set to display 10 out of the 64 trajectories (the number can be changed in utils.py). The quality of the produced trajectories depends on the initialisation and on whether the model has reached convergence, so some trained models can produce noticeably fuzzier or smoother results than others.
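For reference, a minimal sketch of such a visualisation, assuming a batch of generated samples shaped (64, 175, 26) and using placeholder column indices for the end-effector x/y/z coordinates (the real plotting code is in utils.py):

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 -- registers the 3D projection

N_SHOWN = 10  # number of trajectories displayed out of the 64 in a batch

def plot_end_effector_batch(batch, xyz_columns=(0, 1, 2)):
    # batch: array of shape (64, 175, 26); xyz_columns: hypothetical indices
    # of the end-effector Cartesian x/y/z dimensions.
    fig = plt.figure()
    ax = fig.add_subplot(111, projection="3d")
    for sample in batch[:N_SHOWN]:
        ax.plot(sample[:, xyz_columns[0]],
                sample[:, xyz_columns[1]],
                sample[:, xyz_columns[2]])
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.set_zlabel("z")
    plt.show()

plot_end_effector_batch(np.random.rand(64, 175, 26))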

Related algorithms:

This is based on the following DCGAN code: https://github.com/carpedm20/DCGAN-tensorflow
