
A Python optimization toolkit focused on global optimization problems, featuring simple and customizable setup of various optimization algorithms implemented in C++.

Project description

quickopt


A Python optimization toolkit developed in C++ geared towards multimodal functions, including efficient implementations of Bayesian Optimization with Tree-structured Parzen estimators, Particle Swarm Optimization, and more.

Documentation: Click Here!

Most of the information is in the documentation - please refer to it! Tutorials, examples, descriptions/breakdowns of algorithms, and references are all found there.

Installation

To install the package from PyPI, run the following command:

pip install quickopt

Description

This package is a Python optimization toolkit developed in C++, geared towards global optimization problems such as those encountered in hyperparameter tuning. It is designed to be easy to use and flexible: popular optimization methods work out of the box in as little as one line of code, and the optimization process can also be customized to suit your needs. The optimization algorithms currently included in the package are:

  • Simulated Annealing: Compatible with string, float, and int inputs

  • Bayesian Optimization with Tree-Structured Parzen Estimators: Compatible with float inputs

  • Genetic Algorithm: Compatible with string, float, and int inputs

  • Particle Swarm Optimization: Compatible with float inputs

with more to come soon!

Quick Start

To use the package, import the desired optimization function and call it with the desired parameters (described in detail in the documentation). For example, to use the Particle Swarm Optimization algorithm on an objective with float inputs, you would run the following code:

from quickopt.pso import pso

def objective(params): # The function being optimized - this must take its inputs as a list
    return sum(x**2 for x in params)

space_min = [-1.0, -1.0] # Lower bound of the search space in each dimension
space_max = [1.0, 1.0] # Upper bound of the search space in each dimension

result = pso(funct=objective, space_min=space_min, space_max=space_max, iterations=10) # The optimization is run here

print(result)

Note that some algorithms have different functions depending on input type. For example, the Simulated Annealing algorithm has separate functions for double (float), integer, and string inputs, named anneal_double, anneal_int, and anneal_string, respectively. Objective functions must take their inputs as a list, as shown above.
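As a rough sketch, a call to the float variant might look like the following; the import path and parameter names simply mirror the pso example above and are assumptions, not the confirmed signature - see the documentation for the exact API.

from quickopt.anneal import anneal_double # Assumed import path - check the documentation

def objective(params): # Objective functions take their inputs as a list
    return sum((x - 0.5)**2 for x in params)

# Parameter names below mirror the pso example and are assumptions, not the exact API
result = anneal_double(funct=objective, space_min=[-1.0, -1.0], space_max=[1.0, 1.0], iterations=100)

print(result)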

References

Kirkpatrick, S., C. D. Gelatt, and M. P. Vecchi. 1983. “Optimization by Simulated Annealing.” Science 220 (4598): 671–80. https://doi.org/10.1126/science.220.4598.671.

Watanabe, Shuhei. "Tree-Structured Parzen Estimator: Understanding Its Algorithm Components and Their Roles for Better Empirical Performance." arXiv (Cornell University), January 1, 2023. https://doi.org/10.48550/arxiv.2304.11127.

Bergstra, James, Dan Yamins, and David D. Cox. “Making a Science of Model Search.” arXiv (Cornell University), January 1, 2012. https://doi.org/10.48550/arxiv.1209.5111.

Song, Jiaming, Lantao Yu, Willie Neiswanger, and Stefano Ermon. “A General Recipe for Likelihood-free Bayesian Optimization.” arXiv (Cornell University), January 1, 2022. https://doi.org/10.48550/arxiv.2206.13035.

Falkner, Stefan, Aaron Klein, and Frank Hutter. “BOHB: Robust and Efficient Hyperparameter Optimization at Scale.” arXiv (Cornell University), January 1, 2018. https://doi.org/10.48550/arxiv.1807.01774.

Baluja, Shumeet, and Rich Caruana. 1995. “Removing the Genetics From the Standard Genetic Algorithm.” In Elsevier eBooks, 38–46. https://doi.org/10.1016/b978-1-55860-377-6.50014-1.

Bonyadi, Mohammad Reza, and Zbigniew Michalewicz. 2017. “Particle Swarm Optimization for Single Objective Continuous Space Problems: A Review.” Evolutionary Computation 25 (1): 1–54. https://doi.org/10.1162/evco_r_00180.

Detailed references can be found within the documentation.

More

Contact: varunpiram@gmail.com

https://github.com/varunpiram/quickopt

https://pypi.org/project/quickopt/

This project is licensed under the MIT License. See the LICENSE file for details.

Download files

Download the file for your platform.

Source Distribution

quickopt-0.1.8.tar.gz (14.3 kB)

Uploaded Source

Built Distribution


quickopt-0.1.8-cp39-cp39-macosx_10_9_x86_64.whl (344.8 kB)

Uploaded CPython 3.9, macOS 10.9+ x86-64

File details

Details for the file quickopt-0.1.8.tar.gz.

File metadata

  • Download URL: quickopt-0.1.8.tar.gz
  • Upload date:
  • Size: 14.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.15

File hashes

Hashes for quickopt-0.1.8.tar.gz
  • SHA256: a02612ed2faa9146a0664f2a0e0419eb0a8d675e0e4af5c0d158e6604cc0277a
  • MD5: d400aa556f15b355d51d35136cf3e223
  • BLAKE2b-256: 25a46afe37ee8022ca5d1c523eae562e16d9d936aa326f6a359380a0446da350

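If desired, a downloaded file can be checked against the digests above using Python's standard hashlib module; the local file path below is an assumption.

import hashlib

# SHA256 digest for quickopt-0.1.8.tar.gz, copied from the list above
expected = "a02612ed2faa9146a0664f2a0e0419eb0a8d675e0e4af5c0d158e6604cc0277a"

with open("quickopt-0.1.8.tar.gz", "rb") as f: # Assumed path to the downloaded file
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "Hash mismatch!")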

File details

Details for the file quickopt-0.1.8-cp39-cp39-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for quickopt-0.1.8-cp39-cp39-macosx_10_9_x86_64.whl
  • SHA256: 13c4e5de347453183bee8310154895eeaa0c34b38e3b9300cd162c07f6659466
  • MD5: 360238560e8e50bc4f5693acefe0517e
  • BLAKE2b-256: 6ecf506bdebe783c5d89a975e502a970483a514c283b5d49601a318833127aaa

