
Evolutionary Optimization

Project description

Evolutionary Algorithm Optimization (EAO)

Package for optimization using Evolutionary Algorithms (EA), providing base classes for individuals and an evaluation module.

Version History

  • 0.4 Removed necessity to manually assign loss_ attribute
  • 0.5 Added extensive logging capabilities; will be cleaned up a little in later versions
  • 0.6 Fixed critical bug that affects optimization with more than one parent
  • 0.7 Made logging much cleaner; added plus-selection

Documentation

eao (Evolutionary Algorithm Optimization)

This library provides a function minimizer based on a simple, modular evolutionary algorithm scheme that can be adapted to all kinds of target domains. The algorithm performs optimization steps (generations) on a population of solutions by iteratively creating offspring through recombination and mutation, evaluating the offspring with a customizable fitness function, and selecting new parents for the next generation, either through plus selection (best individuals from offspring plus previous parents) or comma selection (best individuals from offspring only). Additionally, eao has useful logging capabilities that allow each optimization run to be fully retraced for further analysis.
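
To make this scheme concrete, here is a minimal illustrative sketch of a single generation step with plus or comma selection. This is not eao's internal implementation; parents, make_offspring, and loss are placeholder names used only for this example:

# Illustrative sketch of a (mu + lambda) / (mu, lambda) generation step.
# Not eao's actual implementation; 'parents', 'make_offspring' and 'loss'
# are placeholder names.
def generation_step(parents, make_offspring, loss, mu, lam, plus_selection=True):
    # Create lambda offspring through recombination and mutation.
    offspring = [make_offspring(parents) for _ in range(lam)]
    # Evaluate each offspring with the loss (inverse fitness) function.
    scored = [(loss(ind), ind) for ind in offspring]
    if plus_selection:
        # Plus selection: previous parents compete with their offspring.
        scored += [(loss(ind), ind) for ind in parents]
    # Keep the mu individuals with the lowest loss as the next parents.
    scored.sort(key=lambda pair: pair[0])
    return [ind for _, ind in scored[:mu]]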

At its core, eao requires the user to extend two classes:

  • Individual: Class representing the target domain, e.g. real numbers, vectors, bit strings or more complex objects
  • Evaluator: Class containing the evaluation method, which takes an Individual object and assigns it a loss value (i.e. an inverse fitness value) by which it is ranked

The following sections will explain how to implement these classes and how to use them with the optimizer. As an example, we will implement an optimization problem on real vectors.

Individual

In a first step, we will create a subclass of Individual called RealVector that represents vectors of float values with fixed size. We'll use numpy arrays as the internal data structure and write a wrapper that implements the methods required for individuals. For a minimal implementation, we need the following methods:

copy() : This method returns a (deep) copy of the object. This is necessary because individuals are copied between populations internally, and implementing it ourselves lets us deep-copy internal attributes that need it, such as numpy arrays.

random() : This is a class method (decorated with @classmethod) that returns a random instance of our data type.

mutate(**kwargs) : This method performs mutation in place, i.e. it introduces a small variation into the individual. For real vectors, we will add some random noise to one of the elements. Additional parameters can be passed to this method and set via the configuration dict; more on this later.

cross(other, **kwargs) : This method performs crossover (recombination) in place, taking another individual as input. For our example, we will perform one-point crossover, meaning we sample a cutting point and take one part from either parent. Just like mutate, this method can take custom parameters. If you don't want to use crossover, you can skip this method.

After implementing these methods for our example application, the RealVector class looks like this:

import numpy as np
from eao import Individual, Evaluator, Optimizer

class RealVector(Individual):

    def __init__(self, vec):
        self.vec = vec

    def copy(self):
        # Return a deep copy; numpy arrays must be copied explicitly.
        return RealVector(self.vec.copy())

    @classmethod
    def random(cls):
        # Random vector of size 16 with elements drawn uniformly from [-1, 1].
        return cls(np.random.uniform(-1, 1, size=16))

    def mutate(self):
        # Add Gaussian noise to a single randomly chosen element.
        index = np.random.choice(16)
        self.vec[index] += np.random.normal(loc=0, scale=0.1)

    def cross(self, other):
        # One-point crossover: sample a cut point and copy the tail
        # from the other parent.
        index = np.random.choice(17)
        self.vec[index:] = other.vec[index:]

As you can see, we'll be working with RealVectors of size 16, and random instances will have elements uniformly distributed between -1 and 1. When implementing random, make sure the initial individuals are not unreasonable, e.g. much too large or too small; otherwise it will be difficult to reach the solution through small mutations.
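
If the mutation strength should be tunable (for example via the configuration dict mentioned above), it can be exposed as a keyword argument. The parameter name scale below is our own choice for illustration; eao only requires that such extra parameters are accepted as keyword arguments:

    def mutate(self, scale=0.1):
        # Same mutation as above, but the noise scale is configurable.
        index = np.random.choice(16)
        self.vec[index] += np.random.normal(loc=0, scale=scale)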

Evaluator

The Evaluator class contains the loss function that we'll have to specify for our optimization problem. We only need to implement a single method:

class RealVectorEvaluator(Evaluator):

    def eval(self, rv):
        # rv is a RealVector; the loss pulls even-index elements towards 1
        # and odd-index elements towards -1.
        even_indices = rv.vec[::2]
        odd_indices = rv.vec[1::2]
        return np.sum((even_indices - 1)**2) + np.sum((odd_indices + 1)**2)

For our example, we sum the squared distances of the even-index elements to 1 and of the odd-index elements to -1, so that the global optimum is a vector of alternating 1 and -1. This somewhat arbitrary loss function is purely for demonstration purposes.
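
As a quick sanity check (not part of the package itself), we can confirm that the alternating vector indeed has a loss of zero:

# The alternating (1, -1) vector should have zero loss.
optimum = RealVector(np.tile([1.0, -1.0], 8))
evaluator = RealVectorEvaluator()
print(evaluator.eval(optimum))  # 0.0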

Configuration

  • Coming soon!

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

eao-0.8.tar.gz (21.6 kB)

Uploaded Source

Built Distribution

eao-0.8-py3-none-any.whl (22.1 kB)

Uploaded Python 3

File details

Details for the file eao-0.8.tar.gz.

File metadata

  • Download URL: eao-0.8.tar.gz
  • Upload date:
  • Size: 21.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.5.0.1 requests/2.23.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.8.3

File hashes

Hashes for eao-0.8.tar.gz
Algorithm Hash digest
SHA256 bc1104b9d54ce385aa7572d5d482c3a32bb07d8a8ffb16cb3f6710955ed0bec8
MD5 aa3996d227a58ae1bbd00b5ba1b6ee6e
BLAKE2b-256 8c9f3edf15faf76b9a7d8f17c86f210538657338ee0b6ccfd1125ea9193c576f

See more details on using hashes here.

File details

Details for the file eao-0.8-py3-none-any.whl.

File metadata

  • Download URL: eao-0.8-py3-none-any.whl
  • Upload date:
  • Size: 22.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.5.0.1 requests/2.23.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.8.3

File hashes

Hashes for eao-0.8-py3-none-any.whl
Algorithm Hash digest
SHA256 ecbe8228c0700bf9bcaf0ac36cb4afb8a216303de83e199589d99ac316f4da30
MD5 80cc994c8bd0b365cd97928975985ca9
BLAKE2b-256 2c993b73f03808f8dccb713b0016bd88a6e54f1b4f0b5efbcb8a5c16abed6460

See more details on using hashes here.
