Tensor Evolution
Evolutionary algorithm for neural network structure
Tensor-Evolution is a library for evolving neural network topology using a genetic algorithm. It currently uses DEAP as its evolutionary backend and TensorFlow for the neural networks.
Note that this library doesn't build networks a single neuron at a time; the basic building blocks are entire layers.
Philosophy
Population members start as the input layer connected directly to the output layer. Mutation operators exist for inserting layers (from a list of supported types), deleting layers, and for mutating an existing layer's properties. A crossover operator is also implemented.
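The library's actual genome representation is richer than this, but a toy sketch of the insert-layer mutation described above (list-of-strings genome and operator name are purely illustrative) might look like:

```python
import random

# Toy genome: an ordered list of layer descriptors. Population members start
# as just the input layer connected directly to the output layer.
genome = ["Input", "Output"]

def insert_layer_mutation(genome, layer_types=("Dense", "ReLU", "BatchNorm")):
    """Insert a randomly chosen layer type at a random interior position."""
    pos = random.randint(1, len(genome) - 1)  # never before Input or after Output
    child = list(genome)
    child.insert(pos, random.choice(layer_types))
    return child

mutated = insert_layer_mutation(genome)
# e.g. ["Input", "Dense", "Output"]
```

Delete-layer and mutate-properties operators would follow the same pattern: pick an interior position, then remove or alter the descriptor there.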
Fitness is evaluated by building, compiling, and training a model from each population member's genome. Training is done the standard way (i.e. via backpropagation, not through any evolutionary means).
Note that most layer types can be added almost anywhere in the genome. If the input shape isn't right, it's corrected: attempts are made to correct it intelligently, but if necessary it's forced to fit.
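As a toy illustration of that kind of shape fix-up (not the library's actual code), a 4-D convolutional feature map can be forced to fit a rank-2 layer such as Dense by flattening its trailing dimensions:

```python
import numpy as np

def coerce_to_rank2(x):
    """Flatten all trailing dimensions so a rank-2 layer (e.g. Dense) accepts x."""
    if x.ndim > 2:
        return x.reshape(x.shape[0], -1)  # keep the batch axis, merge the rest
    return x

batch = np.zeros((32, 28, 28, 3))  # shaped like a Conv2D output
flat = coerce_to_rank2(batch)
# flat.shape == (32, 2352)
```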
Supported Layer Types
This list is currently expanding. So far:
- Dense
- ReLU
- Conv2D, Conv3D
- MaxPool2D, MaxPool3D
- Addition
- BatchNorm
- Flatten
- LSTM (experimental)
Installation
At the moment you'll need to clone the source and then either edit one of the examples or create a new Python file and import the tensor_evolution module. I am working on getting this project on pip.
Usage
Running an Evolution
Start by importing the tensor_evolution module. This is the main driver for the evolution.
import tensor_evolution
Next, prepare your data as a tuple of four objects, like so:
data = x_train, y_train, x_test, y_test
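For example, with a hypothetical toy dataset (random arrays standing in for real features and labels):

```python
import numpy as np

# Hypothetical toy dataset: 100 training and 20 test samples, 8 features each.
x_train = np.random.rand(100, 8)
y_train = np.random.randint(0, 2, size=(100,))
x_test = np.random.rand(20, 8)
y_test = np.random.randint(0, 2, size=(20,))

# The worker expects exactly this order:
# train inputs, train labels, test inputs, test labels.
data = x_train, y_train, x_test, y_test
```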
Then create an evolution worker, and use that worker to drive the evolution:
worker = tensor_evolution.EvolutionWorker()
worker.evolve(data=data)
Please reference the end-to-end examples for full details.
Configuration
Everything is configured via a YAML file. For the moment, since you will need to clone the project to use it, just edit the default config.yaml file.
For example, to change population size to 30:
####
# Evolution Controls
####
...
pop_size: 30 #population size
Mutation rates, valid neural network layer types, input and output shapes, etc. are all controlled from the config file.
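A config.yaml fragment along these lines gives the general shape; pop_size appears above, but the other key names here are hypothetical placeholders, so check the shipped default config for the real ones:

```yaml
####
# Evolution Controls
####
pop_size: 30             # population size
n_generations: 15        # hypothetical key: number of generations to run
mutation_rate: 0.2       # hypothetical key: per-individual mutation probability
valid_layers:            # hypothetical key: layer types the mutators may insert
  - Dense
  - Conv2D
  - BatchNorm
input_shape: [28, 28, 1] # hypothetical key: network input shape
```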
Project Status
Very much still a work in progress (as is this readme), but it is functional. The MNIST example runs just fine.
Dependencies
MNIST Results
The best individual after running MNIST with a population of 20 individuals for 15 generations: