

Tensor Evolution

Tensor-Evolution is a library for evolving neural network topology using a genetic algorithm. This library currently uses DEAP as its evolutionary backend and TensorFlow for the neural networks.

Note that this library doesn't build networks a single neuron at a time; the basic building blocks are entire layers.

Philosophy

Population members start as the input layer connected directly to the output layer. Mutation operators exist for inserting layers (from a list of supported types), deleting layers, and mutating an existing layer's properties. A crossover operator is also implemented.

Fitness is evaluated by building, compiling, and training a model from each population member's genome. Training is done the standard way (i.e. via backpropagation, not through any evolutionary means).

Note that most layer types can be added almost anywhere in the genome. If the input shape isn't right, it's corrected (the library attempts to correct it intelligently, but if required it's forced to fit).
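
For intuition, here is a minimal sketch of that idea in plain TensorFlow/Keras. It is not this library's actual code; the genome format and function names below are invented for illustration only. The point is that each gene describes a whole layer, and fitness is simply accuracy after an ordinary training run.

import tensorflow as tf

# Illustrative only: decode a list of layer descriptions into a compiled model.
def build_model_from_genome(genome, input_shape, num_classes):
    inputs = tf.keras.Input(shape=tuple(input_shape))
    x = inputs
    for gene in genome:
        if gene["type"] == "dense":
            x = tf.keras.layers.Dense(gene["units"], activation="relu")(x)
        elif gene["type"] == "flatten":
            x = tf.keras.layers.Flatten()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Fitness = accuracy after a short, standard backpropagation training run.
def fitness(genome, data):
    x_train, y_train, x_test, y_test = data
    model = build_model_from_genome(genome, x_train.shape[1:], num_classes=10)
    model.fit(x_train, y_train, epochs=1, verbose=0)
    _, accuracy = model.evaluate(x_test, y_test, verbose=0)
    return accuracy

# e.g. fitness([{"type": "flatten"}, {"type": "dense", "units": 64}], data)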

Supported Layer Types

This list is currently expanding. So far:

  • Dense
  • ReLU
  • Conv2D, 3D
  • Maxpool2D, 3D
  • Addition
  • BatchNorm
  • Flatten
  • LSTM
  • GlobalAvgPooling 1D
  • Embedding
  • Concat

Installation

pip install tensor-evolution

Usage

Running an Evolution

Start by importing the tensor_evolution module. This is the main driver for the evolution.

import tensorEvolution

Next, prepare your data as a tuple of four objects, like so:

data = x_train, y_train, x_test, y_test
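
For example, MNIST data can be prepared with the standard Keras loader (the rescaling step is just conventional preprocessing, not a requirement of this library):

import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixel values to [0, 1]

data = x_train, y_train, x_test, y_test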

Then create an evolution worker, and use that worker to drive the evolution:

worker = tensor_evolution.EvolutionWorker()
worker.evolve(data=data)

Please refer to the end-to-end examples for full details.

Configuration

Everything is configured via a YAML file. For the moment, since you will need to clone the project to use it, just edit the default config.yaml file.

For example, to change population size to 30:

####
# Evolution Controls
####
...
pop_size: 30 #population size

Mutation rates, valid neural network layer types, input and output shapes, etc. are all controlled from the config file.
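
A sketch of what a fuller config might look like is below. Only pop_size is taken from the snippet above; the other key names are illustrative guesses, so check the shipped config.yaml for the names the library actually reads.

####
# Evolution Controls
####
pop_size: 30            # population size
generations: 10         # number of generations to evolve (illustrative key name)
mutation_rate: 0.2      # chance of mutating an individual (illustrative key name)
input_shape: [28, 28]   # network input shape (illustrative key name)
output_shape: [10]      # network output shape (illustrative key name)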

Project Status

Very much still a work in progress (as is this README), but it is functional. The MNIST example runs just fine.

Dependencies

Library     License
tensorflow  Apache License 2.0
networkx    BSD 3-Clause
ray         Apache License 2.0
numpy       BSD 3-Clause
deap        GNU Lesser General Public License v3.0
matplotlib  Matplotlib License (PSF-based)
sympy       BSD 3-Clause
graphviz    MIT License

MNIST Results

The best individual after running MNIST with a population of 20 individuals for 10 generations:

MNIST Genome

_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_4 (InputLayer)        [(None, 28, 28)]          0         
                                                                 
 reshape (Reshape)           (None, 28, 28, 1)         0         
                                                                 
 conv2d (Conv2D)             (None, 28, 28, 16)        272       
                                                                 
 conv2d_1 (Conv2D)           (None, 28, 28, 8)         1160      
                                                                 
 flatten (Flatten)           (None, 6272)              0         
                                                                 
 dense (Dense)               (None, 10)                62730     
                                                                 
=================================================================
Total params: 64,162
Trainable params: 64,162
Non-trainable params: 0 
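
For reference, the same topology can be written out by hand in Keras roughly as follows. The kernel sizes (4x4, then 3x3) are inferred from the parameter counts above; the 'same' padding and ReLU activations are assumptions, since the summary does not show them.

import tensorflow as tf

# Hand-written approximation of the evolved MNIST genome above.
# Kernel sizes are inferred from the listed parameter counts (272 and 1,160);
# padding='same' is assumed because the 28x28 spatial size is preserved,
# and the activations are assumptions (the summary does not list them).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Reshape((28, 28, 1)),
    tf.keras.layers.Conv2D(16, kernel_size=4, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(8, kernel_size=3, padding="same", activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])
model.summary()  # 64,162 parameters, matching the totals above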

Auto MPG Dataset Results

The best individual after running Auto MPG with a population of 100 individuals for 20 generations:

AutoMPG Genome

__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_1 (InputLayer)           [(None, 9)]          0           []                               
                                                                                                  
 dropout (Dropout)              (None, 9)            0           ['input_1[0][0]']                
                                                                                                  
 add (Add)                      (None, 9)            0           ['input_1[0][0]',                
                                                                  'dropout[0][0]']                
                                                                                                  
 dense (Dense)                  (None, 256)          2560        ['add[0][0]']                    
                                                                                                  
 flatten (Flatten)              (None, 256)          0           ['dense[0][0]']                  
                                                                                                  
 dense_1 (Dense)                (None, 1)            257         ['flatten[0][0]']                
                                                                                                  
==================================================================================================
Total params: 2,817
Trainable params: 2,817
Non-trainable params: 0
__________________________________________________________________________________________________
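
Because this genome branches (the Add layer consumes both the raw input and the dropout output), it corresponds to the Keras functional API rather than a Sequential model. A rough hand-written equivalent is below; the dropout rate and the hidden activation are assumptions, since the summary does not show them.

import tensorflow as tf

# Hand-written approximation of the evolved Auto MPG genome above.
# The dropout rate and the Dense activation are assumptions; the summary
# only lists layer types, output shapes, and parameter counts.
inputs = tf.keras.Input(shape=(9,))
dropped = tf.keras.layers.Dropout(0.2)(inputs)
added = tf.keras.layers.Add()([inputs, dropped])
hidden = tf.keras.layers.Dense(256, activation="relu")(added)
flat = tf.keras.layers.Flatten()(hidden)
outputs = tf.keras.layers.Dense(1)(flat)

model = tf.keras.Model(inputs, outputs)
model.summary()  # 2,817 parameters, matching the totals above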

Evaluation Results
3/3 [==============================] - 0s 0s/step - loss: 1.5367 - mean_absolute_error: 1.5367
