# Environments

The supported environments are in the `sources` folder; `source.py` contains the parent class.
Currently, the integrated platforms are OpenAI Gym, PyGame, Unity ML-Agents, CARLA, and V-Rep.

The environments in the folder are:

Vector Inputs (1 dimension):

- Pygame
	- Chase
- Gym
	- CartPole  
	- Continuous Mountain Car
	- MuJoCo
		- Pendulum
		- HalfCheetah
		- Hopper
		- Reacher
- Unity Machine Learning
	- 3D Ball
- V-Rep

Image Inputs (2 dimensions):

- Pygame
	- Catch
- Gym
	- Breakout
	- Pong  
- CARLA
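
Both labels refer to the shape of the observations an environment emits. As a quick illustration (a sketch using the plain `gym` API directly, independent of this repo's wrappers), a vector environment like CartPole returns a flat vector each step, while an image environment like Breakout returns a pixel frame:

    import gym

    # Vector input: CartPole observations are a flat 4-dimensional vector.
    env = gym.make("CartPole-v1")
    print(env.observation_space.shape)  # (4,)

    # Image input: Breakout observations are 210x160 RGB frames.
    env = gym.make("Breakout-v0")
    print(env.observation_space.shape)  # (210, 160, 3)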

## Datasets

The datasets currently stored are:

- Gym CartPole
- Unity 3DBall
- Gym HalfCheetah

## Notes

  1. It is easy to add new environments (see the sketch after these notes).

  2. Place the Unity build files of the respective environments in the `learning/sources/unity/` folder.

  3. To export a Unity env .bytes file so the trained model can run inside Unity, call the exporter from the algorithm once a saved trained model of the env is in the `trained_models` folder:

    from sources.source_unity_exporter import *
    export_ugraph(self.brain, "./trained_models/" + trainedmodel, envname, nnoutput)
    raise SystemExit(0)
    # Example with PPO and 3DBall: trainedmodel = "unity_3dball_ppo/", envname = "3dball", nnoutput = "Actor/Mu/MatMul"

  4. To use CARLA, download the 0.9.2 compiled version and put the files in the `/sources/carla/` folder (see the connection sketch below).
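
To illustrate note 1, here is a minimal sketch of what a new environment could look like. The parent class import and the method names (`reset`, `step`) are hypothetical, chosen only for illustration; check `source.py` for the actual interface the repo expects.

    # Hypothetical sketch: the parent class and method names are assumptions,
    # not the repo's real API -- see sources/source.py for the actual interface.
    from sources.source import Source  # assumed class name

    class MyNewEnv(Source):
        def __init__(self):
            super().__init__()
            self.state = None

        def reset(self):
            # Start a new episode and return the first observation.
            self.state = [0.0, 0.0]
            return self.state

        def step(self, action):
            # Apply the action and return (observation, reward, done).
            reward, done = 0.0, False
            return self.state, reward, done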
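
For note 4, a minimal sketch of talking to a running CARLA 0.9.2 server with its standard Python client (assuming the simulator is listening on the default port 2000):

    import carla  # Python client shipped with the compiled CARLA release

    # Connect to the simulator and fetch the active world.
    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)
    world = client.get_world()
    print(world.get_map().name)  # name of the currently loaded map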

## Credits

  1. The V-Rep interface code in `sources/vrep/` is by fgolemo.