
Evolution-inspired optimisation algorithms

Project description

Description

This package is a Python library that aims to provide a range of nature-inspired optimisation algorithms. The purpose of an optimisation algorithm is to find the maximum or minimum of a function.
Genetic algorithms are particularly useful for high-dimensional, non-linear and non-convex problems (e.g. finding a needle in a 10-dimensional haystack). They have a wide range of applications, from supply chain optimisation to hyperparameter tuning. This first version includes an implementation of a genetic algorithm with "regularized evolution".

Genetic algorithms are very useful in machine learning, especially in hyperparameter tuning. The example folder contains two examples of genetic algorithms used to:

  1. Optimise the architecture and hyperparameters of a Neural Network (link)
  2. Tune the hyperparameters of a Support Vector Machine and XGBoost model (link); a minimal sketch of this use case follows the list
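
As an illustration of the second use case, here is a minimal sketch (not taken from the example folder) of how a scikit-learn Support Vector Machine's cross-validation score could be wrapped in an objective function and tuned with evolution_opt. The dataset, hyperparameter ranges, kernel choices, number of rounds, and the assumption that minimize=False maximises the score are all illustrative assumptions.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from evolution_opt.genetic import *

X, y = load_breast_cancer(return_X_y=True)

def svm_cv_score(param_dict):
    # Mean 5-fold cross-validation accuracy for the given hyperparameters.
    model = SVC(C=param_dict['C'],
                gamma=param_dict['gamma'],
                kernel=param_dict['kernel'])
    return np.mean(cross_val_score(model, X, y, cv=5))

svm_search_space = [
    Real(0.01, 100, 'C'),
    Real(0.0001, 1, 'gamma'),
    Categorical(['rbf', 'poly', 'sigmoid'], 'kernel')
]

# minimize=False assumed to maximise the cross-validation accuracy.
best_svm_params = optimise(svm_cv_score, svm_search_space, minimize=False,
                           population_size=20, n_rounds=50)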

The full documentation can be found here.

Installation

This package can be installed with "pip" or by cloning this repository:

$ pip install evolution_opt

Dependencies

To install and run evolution_opt, make sure that you have installed the following packages:

$ pip install numpy pandas scipy matplotlib

Importing evolution_opt

import numpy as np
import pandas as pd
from evolution_opt.genetic import *

Example Usage

1) Define a function to be optimised

This function has to take a dictionary of parameters as its argument:

def difficult_problem(param_dict):
    # Paraboloid with a minimum at x = 0, y = -1; a penalty of 10 is
    # added whenever the 'luck' parameter is not 'lucky'.
    result = param_dict['x']**2 + (param_dict['y'] + 1)**2
    if param_dict['luck'] != 'lucky':
        result += 10
    return result

This function could be any process that takes parameters as input and outputs a scalar value.

It could evaluate a model's cross-validation score for given hyperparameter values, a profit or cost function, the efficiency of a resourcing plan... The possibilities are limitless.
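
As one concrete, purely illustrative example of a cost function, the toy sketch below prices a production plan from a parameter dictionary; the quantities, prices and penalty are made up for illustration and are not part of the package.

def production_cost(param_dict):
    # Toy cost model: each machine has a fixed setup cost, each produced
    # unit has a unit cost, and missing a demand of 1000 units is penalised.
    units = param_dict['machines'] * param_dict['hours_per_machine'] * 10
    cost = 500 * param_dict['machines'] + 2 * units
    shortfall = max(0, 1000 - units)
    return cost + 50 * shortfall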

2) Define a search space

search_space = [
    Integer(-100, 100, 'x'),
    Real(-100, 100, 'y'),
    Categorical(['lucky', 'unlucky'], 'luck')
]

The search space can be composed of Integer, Real and Categorical variables. Numeric parameters are initialised with a lower bound, upper bound and a parameter name. Categorical parameters require a list of possible values and a parameter name.
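
For instance, a search space matching the toy production-cost sketch above (again an illustrative assumption, not one of the package's examples) could mix Integer and Real parameters:

cost_search_space = [
    Integer(1, 20, 'machines'),          # whole number of machines
    Real(1.0, 24.0, 'hours_per_machine') # continuous running time
]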

3) Run the evolutionary algorithm

best_params = optimise(difficult_problem, search_space, minimize=True,
                       population_size=20, n_rounds=500)

# Prints:
# Number of Iterations: 500
# Best score: 0.00410559779230605
# Best parameters: {'x': -0.0, 'y': -1.0640749388786759, 'luck': 'lucky'}
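
Assuming, as the variable name suggests, that optimise returns the best parameter dictionary, it can be passed straight back to the objective to check the reported score:

# Re-evaluate the objective at the returned parameters
# (assumes best_params is a plain parameter dictionary).
print(difficult_problem(best_params))
# Prints a value close to the best score reported above.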

Credits

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

evolution_opt-0.0.5.tar.gz (10.6 kB)

Uploaded Source

Built Distribution

evolution_opt-0.0.5-py3-none-any.whl (10.9 kB)

Uploaded Python 3

File details

Details for the file evolution_opt-0.0.5.tar.gz.

File metadata

  • Download URL: evolution_opt-0.0.5.tar.gz
  • Upload date:
  • Size: 10.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for evolution_opt-0.0.5.tar.gz
  • SHA256: eddc48ce520401a109b530a512e5c870b7625cc8f9506329ddde865d327cefcb
  • MD5: d50196bd60cd89862138637fdd95ec30
  • BLAKE2b-256: f704b31a27c3773e9e22f2bf76472a30634c98bd369be8b2ec3242a76d7609b7

See more details on using hashes here.

File details

Details for the file evolution_opt-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: evolution_opt-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 10.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for evolution_opt-0.0.5-py3-none-any.whl
  • SHA256: 84aa32711db51776ecd868fa73ce5215db0f5cc86c8ca770ac0387dd88f1a4b3
  • MD5: 52e41e9bc97daf56e74c2d75bfb43dc4
  • BLAKE2b-256: 081eb8ef8400b9a4320c4840568ed347bdca11392714ff156e23bb33a10bbbb4

See more details on using hashes here.
