
Simple Particle Swarm Optimization

Project description

Optimus-Beez

This is a Particle Swarm Optimization (PSO) package. The PSO used is the simplest version presented by Maurice Clerc in "Particle Swarm Optimization", with some minor modifications.
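For orientation, the core of a Clerc-style constriction-factor PSO can be sketched in a few lines of plain Python. This is an illustrative sketch only, not the package's actual implementation; the objective `sphere`, the function `simple_pso`, and all parameter defaults below are invented for the example:

```python
import math
import random

def sphere(pos):
    """Toy objective with global minimum 0 at the origin."""
    return sum(x * x for x in pos)

def simple_pso(fn, dim=2, n_particles=10, steps=200, phi=4.1, seed=0):
    """Minimal constriction-factor PSO; a sketch, not optimusbeez code."""
    rng = random.Random(seed)
    # Clerc's constriction coefficient, defined for phi > 4.
    chi = 2 / abs(2 - phi - math.sqrt(phi * phi - 4 * phi))
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's personal best
    gbest = min(pbest, key=fn)[:]        # swarm's global best
    for _ in range(steps):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: pull toward personal and global bests.
                vel[i][d] = chi * (vel[i][d]
                                   + (phi / 2) * r1 * (pbest[i][d] - pos[i][d])
                                   + (phi / 2) * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if fn(pos[i]) < fn(pbest[i]):
                pbest[i] = pos[i][:]
                if fn(pbest[i]) < fn(gbest):
                    gbest = pbest[i][:]
    return gbest, fn(gbest)

best, value = simple_pso(sphere)
```

On the toy sphere function, the swarm contracts onto the origin within a couple of hundred steps.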

Installation

Run the following command:

```
pip install optimusbeez
```

Make sure you have installed the latest version. Old versions may be faulty.

How to use Optimus-Beez

Choosing the function to evaluate

The default function to evaluate is Rosenbrock. To change this, first check out evaluate.py. This file contains predefined functions including Rosenbrock, Alpine, Griewank, and a flat surface. If the function you wish to use is not defined, then go ahead and add it to evaluate.py. Then go to function_info.txt and

  • change fn_name to the name of your function
  • set the search space with xmin and xmax

Note that fn_info and constants are both dictionaries. To get information about the other keys in the fn_info dictionary, use help() on the Experiment object of optimusbeez. If you define your own function, you must also import it in PSO.py.
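As a sketch, a custom objective added to evaluate.py might look like the following. The Rastrigin function and the two-argument (x, y) signature are assumptions for illustration; match whatever signature the existing functions in evaluate.py use:

```python
import math

# Hypothetical addition to evaluate.py: the Rastrigin test function.
# Global minimum is 0 at (0, 0); a common search space is [-5.12, 5.12].
def Rastrigin(x, y):
    return (20 + x ** 2 - 10 * math.cos(2 * math.pi * x)
               + y ** 2 - 10 * math.cos(2 * math.pi * y))
```

You would then set fn_name to 'Rastrigin' and, say, xmin = -5.12 and xmax = 5.12.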

Creating an Experiment

Optimusbeez has an Experiment class. The first steps to using the optimusbeez package are

```python
import optimusbeez as ob

experiment = ob.Experiment()
```

If no arguments are passed to the Experiment object, then it is created with default parameters (hereafter referred to as constants) from the file 'optimal_constants.txt' and function info from 'fn_info.txt'. You can easily change these after creating the experiment object. For example,

```python
experiment.N = 20
```

changes the number of particles in the swarm to 20. You can also change the evaluation function:

```python
experiment.fn_name = 'Alpine'
```

To see the current configuration of constants and function info, you can use:

```python
experiment.constants()
experiment.fn_info()
```

Running the Experiment

To evolve the swarm through time, you must run the experiment.

```python
experiment.run(1000)
```

The argument passed to the run function is the number of evaluations. The experiment will run and show a progress bar. If show_animation is set to True in fn_info, then an animation of the swarm will be shown at the end of the run. Results will be printed on the screen as well as returned in the format (best found position, value at that position, difference from optimal_f). optimal_f is the expected minimum value of the function, usually 0. It is defined in the fn_info dictionary.

Running the Experiment several times

The function evaluate_experiment() is very useful to gauge the constants configuration and how the optimization algorithm fares in general. This function takes 3 arguments - an experiment object, the number of evaluations, and the number of times the experiment should be run. The function returns the average best value and its standard deviation. It also plots a histogram of the results from 0 to 10.
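The idea behind evaluate_experiment() can be illustrated without the package: run a stochastic optimizer several times and summarize the best values found. Everything below, including the toy random_search stand-in, is invented for the example and is not optimusbeez code:

```python
import random
import statistics

def random_search(fn, n_evaluations, rng):
    """Toy stand-in for one experiment run: sample points, keep the best value."""
    return min(fn((rng.uniform(-5, 5), rng.uniform(-5, 5)))
               for _ in range(n_evaluations))

def evaluate_runs(fn, n_evaluations, n_runs, seed=0):
    """Repeat the run n_runs times; report mean and standard deviation."""
    rng = random.Random(seed)
    results = [random_search(fn, n_evaluations, rng) for _ in range(n_runs)]
    return statistics.mean(results), statistics.stdev(results)

sphere = lambda p: p[0] ** 2 + p[1] ** 2
mean_best, std_best = evaluate_runs(sphere, 1000, 10)
```

A small mean with a small standard deviation suggests the configuration is both effective and reliable, which is exactly what evaluate_experiment() lets you judge for a constants configuration.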

Optimizing the Experiment constants

The PSO algorithm itself has 5 parameters: phi, N, k, time steps, and repetitions. These can be changed manually, but Optimusbeez can also optimize itself with the function optimize_constants(). This function takes 4 arguments - allowed evaluations, allowed deviation from the number of evaluations, optimization time steps, and optimization repetitions. It creates a swarm of particles whose 5D positions correspond to the 5 constants, runs an experiment with those constants, and moves the swarm towards the best constants configuration. Usually it does a pretty good job, but it is better to test the resulting constants configuration afterwards and adjust the constants manually.
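The meta-optimization idea - treating the optimizer's own settings as a search space - can be sketched in miniature. The toy hill_climber, its step-size parameter, and tune_step below are all invented for illustration and are unrelated to the actual optimize_constants() internals:

```python
import random

def hill_climber(fn, step, n_steps, rng):
    """Toy 1-D optimizer whose quality depends on its step-size setting."""
    x, best = 0.0, fn(0.0)
    for _ in range(n_steps):
        cand = x + rng.uniform(-step, step)
        if fn(cand) < best:
            x, best = cand, fn(cand)
    return best

def tune_step(fn, candidates, n_steps=200, repetitions=5, seed=0):
    """Score each step-size by its average result over repeated runs,
    then return the best-performing setting."""
    rng = random.Random(seed)
    def score(step):
        return sum(hill_climber(fn, step, n_steps, rng)
                   for _ in range(repetitions)) / repetitions
    return min(candidates, key=score)

quadratic = lambda x: (x - 3.0) ** 2
best_step = tune_step(quadratic, [0.001, 0.1, 1.0, 10.0])
```

optimize_constants() does something analogous but with a PSO searching over five constants at once, which is why it is worth sanity-checking its output with a few manual runs afterwards.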

Testing

Nose is used to test the code. All tests are located in the 'tests' folder. To run the tests, execute:

```
nosetests
```

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optimusbeez-1.0.0.tar.gz (18.4 kB)

Uploaded Source

Built Distribution

optimusbeez-1.0.0-py3-none-any.whl (19.7 kB)

Uploaded Python 3

File details

Details for the file optimusbeez-1.0.0.tar.gz.

File metadata

  • Download URL: optimusbeez-1.0.0.tar.gz
  • Upload date:
  • Size: 18.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/49.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for optimusbeez-1.0.0.tar.gz
Algorithm Hash digest
SHA256 6e9429bab6f4d50baadc02d15e6fb05603248de155a70f5ee76b71734d5b937d
MD5 faac41c29bfa2d03832f31a09c6fabf0
BLAKE2b-256 439f2f1f6220dc5de8d50e0027628ebcb5e2f44d9f5650293d0be300cceeebd8


File details

Details for the file optimusbeez-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: optimusbeez-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 19.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/49.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for optimusbeez-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 7fd87c42f90571a41e3a845da3605264f92591adfa60680c8ff40c48b007e8b7
MD5 85e5b0375d92ea8d9fbb189eee75878f
BLAKE2b-256 e0b4be399b7a5c67604ba8fddb9923fe3c67ba5dd516bff4afd192d27dbb3ccf

