Godot Engine as an OpenAI Gym environment for Reinforcement Learning.
- The 'Environment node', created by the user, implements the required methods (`execute_action()`, `get_observation()`, `get_reward()`, `reset()` and `is_done()`).
- The GymGodot node (`GymGodot.tscn`) bridges the 'Environment node' and the Python-side server.
- `gym-server` communicates with its Godot client and exposes it as a Gym environment. Communication between the server and the client uses WebSocket JSON messages (see `protocol.md`).
- Download or clone this repo.
- Add `GymGodot.tscn`, `GymGodot.gd` and `WebSocketClient.gd` from `/gym-godot` to your Godot project folder. Then add the `GymGodot.tscn` node to your scene.
- Create a node (the 'Environment node') that implements the required functions.
- In the Inspector, set the GymGodot node's 'Environment Node' property to your 'Environment node'.
- On the Python side, install `gym-server` with `pip install -e gym-server`. Use it in your training script like a regular Gym environment (see the sketch below).
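Once `gym-server` is installed and the Godot scene containing the GymGodot node is running, the environment can be driven like any other Gym environment. This is only a sketch: the module name `gym_server` and the environment id `GodotEnv-v0` below are assumptions for illustration, and the actual names are given in `gym-server` and the tutorial notebook. Only the standard Gym calls (`reset()`, `step()`, `close()`) are taken as given.

```python
import gym

# Assumed module name: importing it is presumed to register the Godot
# environment with Gym (check gym-server / tutorial.ipynb for the real name).
import gym_server  # noqa: F401

# Assumed environment id, for illustration only.
env = gym.make('GodotEnv-v0')

obs = env.reset()
done = False
total_reward = 0.0
while not done:
    action = env.action_space.sample()           # random policy, just to exercise the loop
    obs, reward, done, info = env.step(action)   # relayed to the Godot client over WebSocket
    total_reward += reward

print('episode return:', total_reward)
env.close()
```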
A step-by-step tutorial is available in the `tutorial.ipynb` notebook.
- `gym-godot/examples/cartpole/` (description: `cartpole.md`)
- `gym-godot/examples/pendulum/` (description: `pendulum.md`)
- `gym-godot/examples/mars_lander/` (description: `mars_lander.md`)
- Make sure to open the project (`gym-godot/project.godot`) in the Godot Editor at least once before using the example environments, so that the resources are imported.
- Only tested on Linux with Godot 3.3.
- The code follows the Gym API, so it might work with other Gym-compatible frameworks, but it has only been tested with Stable-Baselines 3 (a training sketch follows these notes).
- No current plans for further improvements, maintenance or support.
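Because the environment follows the Gym API, a standard Stable-Baselines 3 training script should apply directly. As above, the `gym_server` module name and the `GodotEnv-v0` id are assumptions for illustration; only the `PPO`, `learn()` and `predict()` calls are regular Stable-Baselines 3 API.

```python
import gym
from stable_baselines3 import PPO

import gym_server  # assumed module name that registers the Godot environment
env = gym.make('GodotEnv-v0')  # assumed environment id

# Train a PPO agent on the Godot environment (tiny budget, purely illustrative).
model = PPO('MlpPolicy', env, verbose=1)
model.learn(total_timesteps=10_000)

# Roll out the trained policy for one episode.
obs = env.reset()
done = False
while not done:
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, done, info = env.step(action)

env.close()
```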


