
Heuristic and meta-heuristic optimisation suite in Python

Project description

FreeLunch - Meta-heuristic optimisation suite for Python


Please note the minor changes to the optimiser call signature since version 0.0.11; details below.


About

FreeLunch is a convenient Python implementation of a number of meta-heuristic optimisation (with an 's') algorithms.


Features

Optimisers

Your favourite not in the list? Feel free to add it.

  • Differential evolution freelunch.DE
  • Simulated Annealing freelunch.SA
  • Particle Swarm freelunch.PSO
  • Krill Herd freelunch.KrillHerd
  • Self-adapting Differential Evolution freelunch.SADE
  • Quantum Particle Swarm freelunch.QPSO

--Coming soon to 0.1.0--

  • Quantum Bees
  • Grenade Explosion Method
  • The Penguin one

Benchmarking functions

Tier list: TBA

  • N-dimensional Ackley function
  • N-dimensional Periodic function
  • N-dimensional Happy Cat function
  • N-dimensional Exponential function

Install

Install with pip (requires numpy).

pip install freelunch

Usage

Create instances of your favourite meta-heuristics!

import freelunch
opt = freelunch.DE(my_objective_function, bounds=my_bounds) # Differential evolution

Where,

  • obj: objective function that accepts a single argument, the trial vector x, and returns a float or None, i.e. obj(x) -> float or None

  • bounds: iterable of bounds for the elements of x, i.e. bounds = [[lower, upper]]*len(x), such that lower <= x[i] <= upper for every element (see the example below).
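
For example, a minimal setup might look like this (a sketch; the sphere objective and the bounds below are illustrative and not part of the library):

import numpy as np
import freelunch

def sphere(x):
    # illustrative objective: sum of squares, minimum of 0.0 at the origin
    return float(np.sum(np.asarray(x)**2))

my_bounds = [[-5, 5], [-5, 5]] # one [lower, upper] pair per element of x

opt = freelunch.DE(sphere, bounds=my_bounds)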

Running the optimisation

Run by calling the instance. There are several different calling signatures. Use any combination of the arguments below to suit your needs!

To return the best solution only:

quick_result = opt() # (D,)

To return the optimum after n_runs:

best_of_nruns = opt(n_runs=n) # (D,)

To return the optimum after n_runs in parallel (uses multiprocessing.Pool; see the note below):

best_of_nruns = opt(n_runs=n, n_workers=w, pool_args={}, chunks=1) # (D,)

To return the best m solutions in an np.ndarray:

best_m = opt(n_return=m) # (D, m)

To return a JSON-friendly dict with fun metadata!

full_output = opt(full_output=True)
    # {
    #     'optimiser':'DE',
    #     'hypers':...,
    #     'bounds':...,
    #     'nruns':nruns,
    #     'nfe':1234,
    #     'solutions':[sol1, sol2, ..., solm*n_runs], # All solutions from all runs sorted by fitness
    #     'scores':[fit1, fit2, ..., fitm*n_runs]
    # }
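
Because the dict is JSON friendly, it can be written straight to disk with the standard library (a minimal sketch, assuming the entries are JSON-serialisable as described above; the filename is illustrative):

import json

with open('freelunch_result.json', 'w') as f:
    json.dump(full_output, f, indent=4) # full_output from the call above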

Customisation

Want to change things around?

  • Change the initialisation strategy

TBC

  • Change the bounding strategy

The simplest way to do this is to overwrite the optimiser.bounder attribute. There are a number of ready-made strategies in freelunch.tech, or alternatively define a custom method with the following call signature.

opt = freelunch.DE(obj, ...)

def my_bounder(p, bounds, **hypers):
    '''custom bounding method'''
    p.dna = ... # custom bounding logic

opt.bounder = my_bounder # overwrite the bounder attribute

# and then call as before
x_optimised = opt()
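
For instance, a simple clipping bounder could be written with numpy (a sketch, assuming bounds arrives as an array-like of [lower, upper] pairs and that candidates expose their position via the .dna attribute, as above):

import numpy as np

def clip_bounder(p, bounds, **hypers):
    # clip every element of the trial vector into its [lower, upper] interval
    bounds = np.asarray(bounds)
    p.dna = np.clip(p.dna, bounds[:, 0], bounds[:, 1])

opt.bounder = clip_bounder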

  • Change the hyperparameters

Check out the hyperparameters and set your own (defaults are set automatically):

print(opt.hyper_definitions)
    # {
    #     'N':'Population size (int)',
    #     'G':'Number of generations (int)',
    #     'F':'Mutation parameter (float in [0,1])',
    #     'Cr':'Crossover probability (float in [0,1])'
    # }

print(opt.hyper_defaults)
    # {
    #     'N':100,
    #     'G':100,
    #     'F':0.5,
    #     'Cr':0.2
    # }

opt.hypers.update({'N':300})
print(opt.hypers)
    # {
    #     'N':300,
    #     'G':100,
    #     'F':0.5,
    #     'Cr':0.2
    # }
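
Updated hyperparameters are picked up on the next call, so the two steps combine naturally (a sketch reusing only the calling options shown above):

opt.hypers.update({'N':300, 'G':200})
full_output = opt(n_runs=5, full_output=True)
print(full_output['nfe'], full_output['scores'][0]) # total evaluations and best fitness found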

Benchmarks

Access them from freelunch.benchmarks, for example:

bench = freelunch.benchmarks.ackley(n=2) # Instantiate a 2D Ackley benchmark function

fit = bench(sol) # evaluate by calling
bench.bounds # [[-10, 10],[-10, 10]]
bench.optimum # [0, 0] 
bench.f0 # 0.0
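
Because a benchmark instance is callable and carries its own bounds, it plugs straight into any of the optimisers (a short sketch using only the API shown above):

import freelunch
from freelunch.benchmarks import ackley

bench = ackley(n=2)
opt = freelunch.DE(bench, bounds=bench.bounds)
x_best = opt() # best solution found, shape (2,)
print(bench(x_best), bench.f0) # achieved fitness vs. the known optimum value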

A note on running optimisations in parallel.

Because multiprocessing.Pool relies on pickle to send code to the parallel processes, it is imperative that anything passed to the freelunch optimisers can be pickled. For example, the following common Python pattern for producing an objective function with a single argument,

method = ... # some methods / args that are required by the objective function
args = ...

def wrap_my_obj(method, args):
    def _obj(x):
        return method(args, x)
    return _obj

obj = wrap_my_obj(method, args)

cannot be pickled because _obj is not importable from the top-level module scope and will raise freelunch.util.UnpicklableObjectiveFunction. Instead, consider using functools.partial, i.e.

from functools import partial

method = ... # some methods / args that are required by the objective function
args = ...


def _obj(method, args, x):
    return method(args, x)

obj = partial(_obj, method, args)
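
The resulting partial object can be pickled, so it is safe to hand to a parallel run (a sketch; the bounds and worker count are illustrative):

opt = freelunch.DE(obj, bounds=my_bounds)
best = opt(n_runs=8, n_workers=4)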

Project details


Download files

Download the file for your platform.

Source Distribution

freelunch-0.0.13.tar.gz (17.6 kB)


Built Distribution

freelunch-0.0.13-py3-none-any.whl (17.8 kB)


File details

Details for the file freelunch-0.0.13.tar.gz.

File metadata

  • Download URL: freelunch-0.0.13.tar.gz
  • Upload date:
  • Size: 17.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.10

File hashes

Hashes for freelunch-0.0.13.tar.gz

  • SHA256: 6075bacdffe23b6c2ff997b504aeb36e573ce85250a429ec8e22b6e6c9119eea
  • MD5: 1924c6b686a74df68b6e7d80d8df1552
  • BLAKE2b-256: f3839d6dc02d3f90449859307cd497615b20e26c07d5c78e448e9537efd96617


File details

Details for the file freelunch-0.0.13-py3-none-any.whl.

File metadata

  • Download URL: freelunch-0.0.13-py3-none-any.whl
  • Upload date:
  • Size: 17.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.10

File hashes

Hashes for freelunch-0.0.13-py3-none-any.whl

  • SHA256: f41702908a1e1c3b0ec9c676d22e20f23c587d51276dc092377729d081fa6ca4
  • MD5: 6137c73cba0c6334d911e28d2324b428
  • BLAKE2b-256: a5da531b885eb924e7846d3a7fab8c9d9eb13e650e2c0a9ec007e8cb422adcec

