NEAT

NEAT implementation based on the neat-python library

Setup

First install Python (v3.5 is recommended for the OpenAI libraries) and install conda or Miniconda. Miniconda is recommended as it is smaller, which helps if disk space is limited.

Set up a conda environment by following the instructions in the Project dependencies repo.
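
As a minimal sketch, assuming an environment named neat-rl and the Python 3.5 recommendation above (the Project dependencies repo remains the authoritative source for exact packages and versions):

conda create -n neat-rl python=3.5   # "neat-rl" is a placeholder environment name
source activate neat-rl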

Install the organisation's custom gym and gym-ple libraries into the conda environment. These libraries extend OpenAI gym. If you want to run gym-ple games, you will also need to install the PyGame-Learning-Environment from here.
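
A rough sketch of these installs, assuming each library has been cloned locally and can be installed in editable mode; the <...> paths are placeholders for the organisation's forks and the PyGame-Learning-Environment checkout linked above:

pip install -e <path-to-custom-gym>        # placeholder paths, substitute your local clones
pip install -e <path-to-custom-gym-ple>
pip install -e <path-to-PyGame-Learning-Environment>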

Running the algorithm

The main file is NEAT.py. Experiment logs are written to the log directory.

Run this file; if you want to capture the standard output for later inspection, use the following command:

python NEAT.py [Environment Id] > output.log 2>&1
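
For example, using one of the environment ids listed below:

python NEAT.py CartPole-v0 > output.log 2>&1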

Currently configured environments

Environment id            Info
CartPole-v0               Standard implementation of gym CartPole
CartPole-v1               Standard implementation of gym CartPole
MountainCar-v0            Standard implementation of the gym mountain car problem
MountainCarExtraLong-v0   Custom implementation of the gym mountain car problem where the episode length is 999.

Use any of these environment ids as the argument.

Adding your own environment

The code uses the environment id to load properties from the properties directory. The layout of the properties directory is:

properties/<Environment id>/Config
properties/<Environment id>/neatem_properties.ini

Adding your environment involves creating a new directory named after your environment id and adding the two properties files used by the algorithm.
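
For instance, a sketch for a hypothetical MyEnv-v0 environment, assuming the CartPole-v0 files are a reasonable starting point to copy and then edit:

mkdir properties/MyEnv-v0                  # "MyEnv-v0" is a hypothetical environment id
cp properties/CartPole-v0/Config properties/MyEnv-v0/Config
cp properties/CartPole-v0/neatem_properties.ini properties/MyEnv-v0/neatem_properties.ini
python NEAT.py MyEnv-v0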
