
A lightweight genetic algorithm package for optimization

Project description

Genetic Algorithm Class

Now available as a package (latest version)


Python implementation of a genetic algorithm to solve optimization problems with n control variables.

Description

A genetic algorithm (GA) is a search heuristic belonging to a broader family of algorithms called evolutionary algorithms (EAs). EAs are population-based metaheuristics that incorporate mechanisms inspired by biological evolution, such as reproduction, mutation, and selection. GAs are particularly useful in optimization problems where calculating gradients of the objective function is difficult or impossible.

Steps in GA

  1. Initialization: initialize a population of individuals, i.e. candidate solutions to the problem, for example by random sampling. Each individual is defined by an encoding that we call its genes.
  2. Selection: rank the candidates with a fitness function we want to optimize and select the best j individuals as parents to be combined. The parameter j is arbitrary.
  3. Crossover: combine the genes of the parents to produce offspring, adding s new individuals to the population.
  4. Mutation: add randomness to the generated offspring, e.g. by adding Gaussian noise to one of the genes of each offspring individual.
  5. Replacement: keep the l fittest individuals of the population for the next epoch.

We repeat these evolution steps for a fixed number of epochs or until an exit condition is met.
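The five steps above can be sketched in a few lines of NumPy. This is a minimal, self-contained illustration of the generic algorithm, not this package's implementation; all names here (`evolve`, its parameters) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def evolve(fitness, n_genes, bounds, pop_size=20, n_parents=6,
           n_offspring=10, mut_sd=0.1, epochs=100):
    """Minimal GA sketch following the five steps above (maximization)."""
    low, high = bounds
    # 1. Initialization: random population, one row per individual
    pop = rng.uniform(low, high, size=(pop_size, n_genes))
    for _ in range(epochs):
        # 2. Selection: keep the j = n_parents fittest individuals as parents
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-n_parents:]]
        # 3. Crossover: each child takes each gene from a random parent
        children = np.empty((n_offspring, n_genes))
        for i in range(n_offspring):
            picks = rng.integers(0, n_parents, size=n_genes)
            children[i] = parents[picks, np.arange(n_genes)]
        # 4. Mutation: Gaussian noise on one random gene per child
        cols = rng.integers(0, n_genes, size=n_offspring)
        children[np.arange(n_offspring), cols] += rng.normal(0, mut_sd, n_offspring)
        children = np.clip(children, low, high)
        # 5. Replacement: keep the l = pop_size fittest of parents + offspring
        pool = np.vstack([pop, children])
        scores = np.array([fitness(ind) for ind in pool])
        pop = pool[np.argsort(scores)[-pop_size:]]
    return pop[-1]  # argsort is ascending, so the last row is the fittest

# Maximize a toy function whose optimum is at (1, 1)
best = evolve(lambda g: -np.sum((g - 1.0) ** 2), n_genes=2, bounds=(-2, 2))
```

Because the replacement step keeps the fittest of the combined pool, the best score never decreases between epochs (elitism).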

GA implementation

Dependencies

The package and the examples below rely on NumPy.

Hyperparameters

  • Individuals:

    • lower_bound
    • upper_bound
    • number_of_genes: dimension of the search space. In this implementation it indicates the shape of the array that represents each individual.
    Note: the number of genes of each individual and the fitness function must be consistent; the fitness function must take one argument per gene.
  • Population:

    • n_parents: the j parents to select.
    • offspring_size: the s new individuals obtained from combining the j parents.
    • mutation_mean, mutation_sd: mean and standard deviation of the Gaussian noise added during the mutation step.
    • size: maximum size of the population, i.e. the l fittest individuals that survive to the next epoch.
  • Evolution:

    • epochs: number of times we repeat the evolution steps.
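The congruence note above can be made concrete: with number_of_genes set to 2, each individual's gene array must unpack into the fitness function's two arguments. A small sketch of that relationship (the unpacking shown here illustrates the constraint, not the package's internals):

```python
import numpy as np

number_of_genes = 2

def fitness(x, y):  # two parameters <-> two genes
    return x * (x - 1) * np.cos(2 * x - 1) * np.sin(2 * x - 1) * (y - 2)

individual = np.array([-1.5, -2.0])          # shape (number_of_genes,)
assert individual.shape == (number_of_genes,)
score = fitness(*individual)                 # genes unpack into arguments
```

A fitness function with three parameters and number_of_genes = 2 would fail at evaluation time, which is why the two must agree.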

Example

An example fitness function could be something like this:

def fitness(x, y):
    return x * (x - 1) * np.cos(2 * x - 1) * np.sin(2 * x - 1) * (y - 2)

We can limit our search to evaluate individuals (x, y) within the domain [-2, 2] × [-2, 2] with the ind_parameters dictionary. Likewise, we control the population parameters with the pop_parameters dictionary.

# example.py
import numpy as np
from ga.evolution import Evolution

# Define a fitness function
def fitness(x, y):
    return x * (x - 1) * np.cos(2 * x - 1) * np.sin(2 * x - 1) * (y - 2)

# Define parameter for each individual
ind_parameters = {'lower_bound': -2,
                  'upper_bound': 2,
                  'number_of_genes': 2}

# Define parameter for the entire population
pop_parameters = {'n_parents': 6,
                  'offspring_size': (2, ind_parameters['number_of_genes']),
                  'mutation_mean': 0.25,
                  'mutation_sd': 0.5,
                  'size': 10}
def example():
    # Instantiate an evolution
    evo = Evolution(pop_parameters, ind_parameters, fitness)
    # Repeat the evolution step for 200 epochs
    epochs = 200
    # Record fitness history
    history = []
    x_history = []
    y_history = []
    for epoch in range(epochs):
        print('Epoch {}/{}, Progress: {}%\r'.format(epoch + 1, epochs, np.round((epoch + 1) / epochs * 100, 2)), end="")
        evo.step()
        history.append(evo._best_score)
        x_history.append(evo._best_individual[0][0])
        y_history.append(evo._best_individual[0][1])

    print('\nResults:')
    print('Best individual:', evo.solution.best_individual)
    print('Fitness value of best individual:', evo.solution.best_score)

example()

The result is very close to the global optimum within this domain, and the best individual stops changing after about 50 epochs.

# Output
Epoch 200/200, Progress: 100.0%
Results:
Best individual: [-1.52637873, -2.        ]
Fitness value of best individual: 7.4697265870418414
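As a sanity check, plugging the reported best individual back into the fitness function reproduces the reported score:

```python
import numpy as np

def fitness(x, y):
    return x * (x - 1) * np.cos(2 * x - 1) * np.sin(2 * x - 1) * (y - 2)

best_x, best_y = -1.52637873, -2.0
score = fitness(best_x, best_y)
print(score)  # ~7.4697, matching the output above
```

Note that y sits on the boundary of the domain (lower_bound = -2), so the optimum found is a constrained one.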


Project details


Download files


Source Distribution

generic-algorithm-light-0.2.1.tar.gz (154.2 kB)

Uploaded Source

Built Distribution


generic_algorithm_light-0.2.1-py3-none-any.whl (7.0 kB)

Uploaded Python 3

File details

Details for the file generic-algorithm-light-0.2.1.tar.gz.

File metadata

  • Download URL: generic-algorithm-light-0.2.1.tar.gz
  • Upload date:
  • Size: 154.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.8

File hashes

Hashes for generic-algorithm-light-0.2.1.tar.gz
Algorithm Hash digest
SHA256 27dea809d60bad4391a5b6779b9f60c1719c24b5320dadd7e6fc85661ca32f8c
MD5 4adce555f9f276ed77322d81ddc9257a
BLAKE2b-256 8d4d45154881335f7bf3b5734e4bfbb1726f2e1d5a548fcf79c0e52561992feb


File details

Details for the file generic_algorithm_light-0.2.1-py3-none-any.whl.

File metadata

File hashes

Hashes for generic_algorithm_light-0.2.1-py3-none-any.whl
Algorithm Hash digest
SHA256 1231170cbbb361004d6ce47018a93855ee8bcdbdaee6431f29dcb254db104540
MD5 ae5d68092f6b2a0f31ccf9994b4e0894
BLAKE2b-256 0d7f112f9e3c67f1d30ecd0b2792f2dd6afa14c9c3621584483f9734203d1f33

