Update devices + related docs
kevin-thankyou-lin committed Nov 22, 2024
1 parent 74c6aac commit 1872481
Showing 3 changed files with 46 additions and 41 deletions.
34 changes: 4 additions & 30 deletions docs/algorithms/demonstrations.md
@@ -2,39 +2,13 @@

## Collecting Human Demonstrations

We provide teleoperation utilities that allow users to control the robots with input devices, such as the keyboard and the [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/). Such functionality allows us to collect a dataset of human demonstrations for learning. We provide an example script to illustrate how to collect demonstrations. Our [collect_human_demonstrations](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/scripts/collect_human_demonstrations.py) script takes the following arguments:
We provide teleoperation utilities that allow users to control the robots with input devices such as the keyboard, the [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/), and the MuJoCo GUI. Such functionality allows us to collect a dataset of human demonstrations for learning. We provide an example script to illustrate how to collect demonstrations. Our [collect_human_demonstrations](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/scripts/collect_human_demonstrations.py) script takes the following arguments:

- `directory:` path to the folder where the pickle file of collected demonstrations will be stored
- `environment:` name of the environment you would like to collect the demonstrations for
- `device:` either "keyboard" or "spacemouse"

### Keyboard controls

Note that the rendering window must be active for these commands to work.

| Keys | Command |
| :------: | :--------------------------------: |
| q | reset simulation |
| spacebar | toggle gripper (open/close) |
| w-a-s-d | move arm horizontally in x-y plane |
| r-f | move arm vertically |
| z-x | rotate arm about x-axis |
| t-g | rotate arm about y-axis |
| c-v | rotate arm about z-axis |
| ESC | quit |

### 3Dconnexion SpaceMouse controls

| Control | Command |
| :-----------------------: | :-----------------------------------: |
| Right button | reset simulation |
| Left button (hold) | close gripper |
| Move mouse laterally | move arm horizontally in x-y plane |
| Move mouse vertically | move arm vertically |
| Twist mouse about an axis | rotate arm about a corresponding axis |
| ESC (keyboard) | quit |

- `device:` one of "keyboard", "spacemouse", or "mjgui"

See the [devices page](https://robosuite.ai/docs/modules/devices.html) for details on how to use the devices.
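For example, a typical invocation might look like the following; the task name and output folder are illustrative, and the flag names are assumed to match the argument list above:

```shell
# Hypothetical example: collect keyboard demonstrations for the Lift task.
# Adjust the output directory and environment name for your setup.
python robosuite/scripts/collect_human_demonstrations.py \
    --directory /tmp/demos \
    --environment Lift \
    --device keyboard
```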

## Replaying Human Demonstrations

@@ -79,7 +53,7 @@ The reason for storing mujoco states instead of raw observations is to make it e

## Using Demonstrations for Learning

We have recently released the [robomimic](https://arise-initiative.github.io/robomimic-web/) framework, which makes it easy to train policies using your own [datasets collected with robosuite](https://arise-initiative.github.io/robomimic-web/docs/introduction/datasets.html#robosuite-hdf5-datasets), and other publically released datasets (such as those collected with RoboTurk). The framework also contains many useful examples for how to integrate hdf5 datasets into your own learning pipeline.
The [robomimic](https://arise-initiative.github.io/robomimic-web/) framework makes it easy to train policies using your own [datasets collected with robosuite](https://arise-initiative.github.io/robomimic-web/docs/introduction/datasets.html#robosuite-hdf5-datasets). The framework also contains many useful examples of how to integrate hdf5 datasets into your own learning pipeline.
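As a concrete sketch of working with such a dataset, the snippet below builds a tiny stand-in HDF5 file and summarizes it with `h5py`. The `data/demo_i` group layout with `states` and `actions` datasets is an assumption based on robomimic's documented conventions, not a guarantee, and the helper name is ours:

```python
# Sketch: inspecting a robosuite/robomimic-style HDF5 demo file with h5py.
# The layout ("data/demo_0/states", "actions") is an assumed convention.
import os
import tempfile

import h5py
import numpy as np

def summarize_demos(path):
    """Return (num_demos, total_transitions) for an hdf5 demo file."""
    with h5py.File(path, "r") as f:
        demos = list(f["data"].keys())
        total = sum(f[f"data/{d}/actions"].shape[0] for d in demos)
    return len(demos), total

# Build a tiny stand-in file so the sketch runs without robosuite installed.
path = os.path.join(tempfile.gettempdir(), "demo_sketch.hdf5")
with h5py.File(path, "w") as f:
    grp = f.create_group("data")
    for i in range(2):
        d = grp.create_group(f"demo_{i}")
        d.create_dataset("states", data=np.zeros((10, 32)))   # mujoco states
        d.create_dataset("actions", data=np.zeros((10, 7)))   # arm + gripper

print(summarize_demos(path))
```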

The robosuite repository also has some utilities for using the demonstrations to alter the start state distribution of training episodes when learning RL policies, an approach that has proved effective in [several](https://arxiv.org/abs/1802.09564) [prior](https://arxiv.org/abs/1807.06919) [works](https://arxiv.org/abs/1804.02717). For example, we provide a generic utility for setting various types of learning curricula that dictate how to sample from demonstration episodes when doing an environment reset. For more information, see the `DemoSamplerWrapper` class.

51 changes: 41 additions & 10 deletions docs/modules/devices.md
@@ -10,16 +10,20 @@ We support keyboard input through the OpenCV2 window created by the mujoco rende

Note that the rendering window must be active for these commands to work.

| Keys | Command |
| :------- | :--------------------------------- |
| q | reset simulation |
| spacebar | toggle gripper (open/close) |
| w-a-s-d | move arm horizontally in x-y plane |
| r-f | move arm vertically |
| z-x | rotate arm about x-axis |
| t-g | rotate arm about y-axis |
| c-v | rotate arm about z-axis |
| ESC | quit |
| Keys | Command |
| :------------------ | :----------------------------------------- |
| Ctrl+q | reset simulation |
| spacebar | toggle gripper (open/close) |
| up-right-down-left | move horizontally in x-y plane |
| .-; | move vertically |
| o-p | rotate (yaw) |
| y-h | rotate (pitch) |
| e-r | rotate (roll) |
| b                   | toggle arm/base mode (if applicable)       |
| s | switch active arm (if multi-armed robot) |
| = | switch active robot (if multi-robot env) |
| ESC | quit |
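The bindings above amount to a mapping from keys to 6-DoF deltas (position and orientation). The sketch below illustrates that idea; the step size, sign conventions, and helper name are assumptions for illustration, not robosuite's actual `Keyboard` implementation:

```python
# Illustrative key-to-delta mapping: (dx, dy, dz, droll, dpitch, dyaw).
# The 0.05 step and axis signs are assumptions; robosuite's Keyboard
# device applies its own sensitivities internally.
STEP = 0.05
KEY_TO_DELTA = {
    "up":    ( STEP, 0, 0, 0, 0, 0),  "down":  (-STEP, 0, 0, 0, 0, 0),
    "left":  (0,  STEP, 0, 0, 0, 0),  "right": (0, -STEP, 0, 0, 0, 0),
    ".":     (0, 0,  STEP, 0, 0, 0),  ";":     (0, 0, -STEP, 0, 0, 0),
    "e":     (0, 0, 0,  STEP, 0, 0),  "r":     (0, 0, 0, -STEP, 0, 0),
    "y":     (0, 0, 0, 0,  STEP, 0),  "h":     (0, 0, 0, 0, -STEP, 0),
    "o":     (0, 0, 0, 0, 0,  STEP),  "p":     (0, 0, 0, 0, 0, -STEP),
}

def keys_to_delta(pressed):
    """Sum the deltas of all currently pressed movement keys."""
    delta = [0.0] * 6
    for key in pressed:
        for i, d in enumerate(KEY_TO_DELTA.get(key, (0,) * 6)):
            delta[i] += d
    return tuple(delta)
```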


## 3Dconnexion SpaceMouse

@@ -35,3 +39,30 @@ We support the use of a [SpaceMouse](https://www.3dconnexion.com/spacemouse_comp
| Move mouse vertically | move arm vertically |
| Twist mouse about an axis | rotate arm about a corresponding axis |
| ESC (keyboard) | quit |


## Mujoco GUI Device

To use the Mujoco GUI device for teleoperation, follow these steps:

1. Set the renderer to `"mjviewer"`. For example:

```python
env = suite.make(
**options,
renderer="mjviewer",
has_renderer=True,
has_offscreen_renderer=False,
ignore_done=True,
use_camera_obs=False,
)
```

Note: on macOS, use `mjpython` instead of `python`. For example:

```sh
mjpython robosuite/scripts/collect_human_demonstrations.py --environment Lift --robots Panda --device mjgui --camera frontview --controller WHOLE_BODY_IK
```

2. Double-click a mocap body to select it, then drag it:

   - On Linux: `Ctrl` + right click drags the body's position; `Ctrl` + left click controls the body's orientation.
   - On Mac: use `fn` + `Ctrl` + right click.
2 changes: 1 addition & 1 deletion docs/simulation/device.rst
@@ -1,7 +1,7 @@
Device
======

Devices allow for direct real-time interfacing with the MuJoCo simulation. The current support devices are ``Keyboard`` and ``SpaceMouse``.
Devices allow for direct real-time interfacing with the MuJoCo simulation. The currently supported devices are ``Keyboard``, ``SpaceMouse``, and ``MjGUI``.

Base Device
-----------
