NeuroEvolution of Augmenting Topologies in Python

prettyNEAT

[Demo GIFs: cartpole swingup, biped walker, ant, racer]

NeuroEvolution of Augmenting Topologies (NEAT) algorithm in numpy, built for multicore use and OpenAI's gym interface.

Original paper by Ken Stanley and Risto Miikkulainen: Evolving Neural Networks Through Augmenting Topologies

Implementation created by Adam Gaier and originally released as part of the Google Brain Tokyo Workshop.

Installation

prettyNEAT can be installed from PyPI:

pip install prettyNEAT

Or installed from source:

git clone https://github.com/d9w/prettyNEAT
cd prettyNEAT
python setup.py install

Other dependencies

The provided example scripts, which evolve individuals for gym environments, have further dependencies, including mpi4py for distributed evaluation. To install these additional dependencies:

pip install -r requirements.txt

Running NEAT

[Demo GIFs: cartpole swingup]

The 'cartpole_swingup' task has no additional dependencies and is set as the default task; try it with the default parameters.

Training command:

python evolution.py

To view the performance of a trained controller (by default, log/test_best.out is loaded):

python evaluation.py

To load and test a specific network:

python neat_test.py -i demo/swingup.out

Data Gathering and Visualization

Data about each run is stored by default in the log folder with the test prefix, though a new prefix can be specified:

python evolution.py -o myExperiment_

Output files will still be placed in the 'log' folder, but prefixed with 'myExperiment_'.

In addition to the best-performing individual, prettyNEAT regularly updates a _stats.out file with run statistics. These statistics are stored as comma-separated values, and helper functions are provided to display them, as well as the topology of the evolved networks.

See the prettyNeat_demo.ipynb notebook for example usage.
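
For a quick look outside the notebook, the statistics file can also be loaded directly with numpy. A minimal sketch, assuming a run with the default 'test' prefix; the column meanings below are illustrative assumptions, not the actual layout (the demo notebook documents the real one):

  import numpy as np
  import matplotlib.pyplot as plt

  # One row per generation; prettyNEAT writes plain comma-separated values.
  stats = np.loadtxt('log/test_stats.out', delimiter=',')

  # Column indices are assumed for illustration only; see
  # prettyNeat_demo.ipynb for the real layout.
  plt.plot(stats[:, 0], label='column 0 (e.g. best fitness)')
  plt.plot(stats[:, 1], label='column 1 (e.g. mean fitness)')
  plt.xlabel('generation')
  plt.legend()
  plt.show()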

Distributed evaluation

prettyNEAT uses an ask/tell pattern to handle parallelization:

  neat = Neat(hyp)  # Initialize Neat with hyperparameters
  for gen in range(hyp['maxGen']):        
    pop = neat.ask()            # Get newly evolved individuals from NEAT  
    reward = batchMpiEval(pop)  # Send population to workers to evaluate
    neat.tell(reward)           # Send fitness values back to NEAT    
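
The same loop also runs serially without MPI: tell only needs one fitness value per individual returned by ask. A minimal sketch reusing the names above, where evaluate is a hypothetical stand-in for your own rollout:

  import numpy as np

  # 'evaluate' is a hypothetical placeholder: roll out the individual's
  # network in your environment and return the episode's total reward.
  def evaluate(ind):
    raise NotImplementedError

  neat = Neat(hyp)                  # as in the snippet above
  for gen in range(hyp['maxGen']):
    pop = neat.ask()                # newly evolved individuals
    reward = np.asarray([evaluate(ind) for ind in pop])
    neat.tell(reward)               # one fitness value per individual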

The number of workers can be specified when called from the command line:

python evo_distributed.py -n 8

Algorithm hyperparameters are stored in a .json file. Default parameters are specified with -d, modifications with -p:

python evo_distributed.py -d config/neat_default.json

or, to use the defaults except for certain changes:

python evo_distributed.py -p config/swingup.json       # Swing up with standard parameters
python evo_distributed.py -p config/swing_allAct.json  # Swing up but allow hidden nodes to have several activations

The full list of hyperparameters is explained in hypkey.txt.
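
A -p file only needs to contain the keys that differ from the defaults in config/neat_default.json. A sketch of such an override file; only maxGen appears elsewhere in this README, the other key names are illustrative (see hypkey.txt for the real ones):

  {
    "task": "swingup",
    "maxGen": 1024
  }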


Extensions and Differences from Canonical NEAT

A few key differences and common extensions from the original NEAT paper have been included:

  • Compatibility threshold update

    • The compatibility threshold is regularly updated to keep the number of species near a desired target. Though use of this update is widespread and it is mentioned on the NEAT User's Page, to my knowledge it has never been explicitly described in a publication.
  • Activation functions

    • Unless the hyperparameters fix a single activation function for all hidden nodes, each newly created node's activation function is chosen from a list defined by the task. A probability of mutating a node's activation function is also defined. This allows the code to easily handle extensions for HyperNEAT and CPPN experiments.
  • Rank-based Fitness

    • Canonical NEAT uses raw fitness values to determine the relative fitness of individuals and species. This can cause scaling problems and cannot handle negative fitness values. prettyNEAT instead ranks the population and assigns each individual a real-valued fitness based on this ranking.
  • Multi-objective

    • Many extensions of NEAT involve optimizing for additional objectives (age, number of connections, novelty, etc.), and we include non-dominated sorting of the population by multiple objectives. The probability that these alternate objectives are applied can also be tuned (e.g. normal optimization, but with a 20% chance of ranking by both fitness and number of connections). This can be used with or without speciation.
  • Weight Tuning with CMA-ES

    • Networks produced by prettyNEAT are exported in the form of weight matrices and a vector of activation functions; a sketch of this representation in use follows this list. We provide an interface to further tune the weights of these networks with CMA-ES:
    python cmaes.py -i log/test_best.out
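
Since an exported network is just a weight matrix plus a per-node vector of activation functions, evaluating it reduces to a sweep over the matrix. The sketch below is an illustration under stated assumptions, not prettyNEAT's actual loader: it assumes nodes are ordered input → hidden → output, that wMat[i, j] is the weight from node i to node j, and a made-up activation-id table:

  import numpy as np

  # Hypothetical id -> function table; the real ids are defined in prettyNEAT.
  ACT = {1: lambda x: x, 2: np.tanh}

  def forward(wMat, aVec, obs):
    nNodes = wMat.shape[0]
    act = np.zeros(nNodes)
    nIn = len(obs)
    act[:nIn] = obs                            # input nodes hold the observation
    for j in range(nIn, nNodes):               # sweep hidden and output nodes
      act[j] = ACT[aVec[j]](act @ wMat[:, j])  # weighted sum, then activation
    return act                                 # output activations sit at the end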
    
