
quickopt

A Python optimization toolkit developed in C++ geared towards multimodal functions.

Documentation: Click Here!

Most of the information lives in the documentation, so please refer to it: tutorials, examples, algorithm breakdowns, and references are all found there.

Installation

To install the package from PyPI, run the following command:

pip install quickopt

Description

This package is a Python optimization toolkit implemented in C++, geared towards global optimization problems such as those encountered in hyperparameter tuning. It is designed to be easy to use and flexible: popular optimization methods work out of the box in as little as one line of code, while the optimization process remains customizable to suit specific needs. The optimization algorithms included in the package are listed below:

  • Simulated Annealing: Compatible with string, float, and int inputs

  • Bayesian Optimization with Tree-Structured Parzen Estimators: Compatible with float inputs

  • Genetic Algorithm: Compatible with string, float, and int inputs

  • Particle Swarm Optimization: Compatible with float inputs

with more to come soon!
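For intuition about what the first algorithm on this list does, here is a minimal pure-Python sketch of simulated annealing on a continuous function. This is an illustrative toy, independent of quickopt's C++ implementation; the function name `anneal_sketch`, the cooling schedule, and the step size are all assumptions made for the example, not quickopt internals.

```python
import math
import random

def anneal_sketch(objective, x0, steps=5000, temp0=1.0, seed=0):
    """Minimal simulated annealing for minimization (illustrative only)."""
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    best_x, best_f = list(x), fx
    for t in range(steps):
        temp = temp0 * (1 - t / steps) + 1e-9  # linear cooling schedule
        # Propose a small random perturbation of one coordinate
        cand = list(x)
        i = rng.randrange(len(cand))
        cand[i] += rng.gauss(0, 0.1)
        fc = objective(cand)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-delta / temp), which shrinks as we cool
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
    return best_x, best_f

best_x, best_f = anneal_sketch(lambda p: sum(v * v for v in p), [0.8, -0.6])
print(best_x, best_f)  # the true minimum of this objective is at [0, 0]
```

The uphill-acceptance rule is what lets the method escape local minima on multimodal functions, which is the use case this package targets.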

Quick Start

To use the package, import the desired optimization function from the package and run it with the desired parameters (described in detail in documentation). For example, to use the Particle Swarm Optimization algorithm for a double function, you would run the following code:

from quickopt.pso import pso

def objective(params):  # The function being optimized - it must take its inputs as a list
    return sum(x**2 for x in params)

space_min = [-1.0, -1.0]  # Lower bounds of the search space
space_max = [1.0, 1.0]    # Upper bounds of the search space

result = pso(funct=objective, space_min=space_min, space_max=space_max, iterations=10)  # The optimization is run here

print(result)
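The quick start above calls quickopt's C++ PSO; purely for intuition, here is a minimal pure-Python sketch of the same idea on the same objective and bounds. The function name `pso_sketch` and the coefficient values are illustrative assumptions, not quickopt's internals.

```python
import random

def pso_sketch(objective, space_min, space_max, particles=20, iterations=100, seed=0):
    """Minimal particle swarm optimization for minimization (illustrative only)."""
    rng = random.Random(seed)
    dim = len(space_min)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, and social coefficients
    pos = [[rng.uniform(space_min[d], space_max[d]) for d in range(dim)]
           for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                    # each particle's best position
    pbest_f = [objective(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]       # swarm-wide best position
    for _ in range(iterations):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity blends momentum with pulls toward personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move, clamped to the search space bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], space_min[d]), space_max[d])
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best, best_f = pso_sketch(lambda p: sum(x**2 for x in p), [-1.0, -1.0], [1.0, 1.0])
```

Each particle is pulled toward both its own best-seen point and the swarm's best, which is why PSO needs only bounds and an objective, no gradients.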

Note that some functions have different definitions depending on variable type. For example, the Simulated Annealing algorithm provides separate functions for double, integer, and string inputs, named anneal_double, anneal_int, and anneal_string, respectively. Objective functions must take their inputs as a list, as shown above.
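To make the list-input convention concrete for string-valued problems, here is a small self-contained sketch (not using quickopt) of a string objective and a greedy coordinate search over a discrete vocabulary. The names `string_objective` and `greedy_search`, the vocabulary, and the target word are all hypothetical, chosen for the example.

```python
def string_objective(params):
    # Objective functions take their inputs as a list; here each element is a
    # string, and the score counts mismatches against a target word (lower is better)
    target = ["g", "l", "o", "b", "a", "l"]
    return -sum(a == b for a, b in zip(params, target))

def greedy_search(objective, vocab, length):
    """Optimize one position at a time over a discrete vocabulary (illustrative only)."""
    params = [vocab[0]] * length
    for i in range(length):
        # Try every symbol at position i and keep the one with the best score
        scores = {v: objective(params[:i] + [v] + params[i + 1:]) for v in vocab}
        params[i] = min(scores, key=scores.get)
    return params, objective(params)

vocab = list("abglo")
best, best_f = greedy_search(string_objective, vocab, 6)
print("".join(best), best_f)
```

In quickopt itself the analogous string-valued problem would go through anneal_string; its exact parameters are described in the documentation.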

References:

Kirkpatrick, S., C. D. Gelatt, and M. P. Vecchi. 1983. “Optimization by Simulated Annealing.” Science 220 (4598): 671–80. https://doi.org/10.1126/science.220.4598.671.

Watanabe, Shuhei. 2023. “Tree-Structured Parzen Estimator: Understanding Its Algorithm Components and Their Roles for Better Empirical Performance.” arXiv. https://doi.org/10.48550/arxiv.2304.11127.

Bergstra, James, Dan Yamins, and David D. Cox. 2012. “Making a Science of Model Search.” arXiv. https://doi.org/10.48550/arxiv.1209.5111.

Song, Jiaming, Lantao Yu, Willie Neiswanger, and Stefano Ermon. 2022. “A General Recipe for Likelihood-free Bayesian Optimization.” arXiv. https://doi.org/10.48550/arxiv.2206.13035.

Falkner, Stefan, Aaron Klein, and Frank Hutter. 2018. “BOHB: Robust and Efficient Hyperparameter Optimization at Scale.” arXiv. https://doi.org/10.48550/arxiv.1807.01774.

Baluja, Shumeet, and Rich Caruana. 1995. “Removing the Genetics From the Standard Genetic Algorithm.” In Elsevier eBooks, 38–46. https://doi.org/10.1016/b978-1-55860-377-6.50014-1.

Bonyadi, Mohammad Reza, and Zbigniew Michalewicz. 2017. “Particle Swarm Optimization for Single Objective Continuous Space Problems: A Review.” Evolutionary Computation 25 (1): 1–54. https://doi.org/10.1162/evco_r_00180.

More

Contact: varunpiram@gmail.com

https://github.com/varunpiram/quickopt

https://pypi.org/project/quickopt/

This project is licensed under the MIT License. See the LICENSE file for details.
