
A package doing simulated and quantum annealing

Project description

License: GPL v3

Compatibilities

Ubuntu, Unix; Python

Contact

LinkedIn, website, mail

AdAnnealing

A package doing simulated annealing

Installation

git clone https://github.com/pcotteadvestis/adannealing
cd adannealing
pip install .

Usage

Simple usage:

from adannealing import Annealer

def loss_func_2d(w) -> float:
    x = w[0]
    y = w[1]
    return (x - 5) * (x - 2) * (x - 1) * x + 10 * y ** 2

init_states, bounds, acceptance = (3.0, 0.5), ((0, 5), (-1, 1)), 0.01

ann = Annealer(
    loss=loss_func_2d,
    weights_step_size=0.1,
    init_states=init_states,  # Optional
    bounds=bounds,
    verbose=True
)

# Weights of local minimum, and loss at local minimum
w0, lmin, _, _, _, _ = ann.fit(stopping_limit=acceptance)
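
As a sanity check, the same loss can be evaluated on a brute-force grid over the bounds used above and compared with the annealer's output (a minimal sketch using numpy, not part of the package's API):

import numpy as np

# Evaluate the same loss on a grid covering the bounds ((0, 5), (-1, 1))
xs = np.linspace(0, 5, 501)
ys = np.linspace(-1, 1, 201)
X, Y = np.meshgrid(xs, ys)
Z = (X - 5) * (X - 2) * (X - 1) * X + 10 * Y ** 2

# Coordinates of the smallest value found on the grid
i, j = np.unravel_index(np.argmin(Z), Z.shape)
print("grid minimum near", (X[i, j], Y[i, j]), "with loss", Z[i, j])
# w0 and lmin returned by ann.fit should be close to these values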

Use multiple initial states in parallel runs and get one output per initial state:

from adannealing import Annealer

Annealer.set_parallel()


def loss_func_2d(w) -> float:
    x = w[0]
    y = w[1]
    return (x - 5) * (x - 2) * (x - 1) * x + 10 * y ** 2

bounds, acceptance, n = ((0, 5), (-1, 1)), 0.01, 5

ann = Annealer(
    loss=loss_func_2d,
    weights_step_size=0.1,
    bounds=bounds,
    verbose=True
)

# Iterable of n results, each with the weights of a local minimum and the loss at that minimum
results = ann.fit(npoints=n, stopping_limit=acceptance)
for w0, lmin, _, _, _, _ in results:
    """do something"""

Use multiple initial states in parallel runs and get the result with the smallest loss:

from adannealing import Annealer

Annealer.set_parallel()


def loss_func_2d(w) -> float:
    x = w[0]
    y = w[1]
    return (x - 5) * (x - 2) * (x - 1) * x + 10 * y ** 2


bounds, acceptance, n = ((0, 5), (-1, 1)), 0.01, 5

ann = Annealer(
    loss=loss_func_2d,
    weights_step_size=0.1,
    bounds=bounds,
    verbose=True
)

# Weights of the best local minimum and loss at the best local minimum
w0, lmin, _, _, _, _ = ann.fit(npoints=n, stopping_limit=acceptance, stop_at_first_found=True)

One can save the learning history by passing a path:

from adannealing import Annealer

Annealer.set_parallel()


def loss_func_2d(w) -> float:
    x = w[0]
    y = w[1]
    return (x - 5) * (x - 2) * (x - 1) * x + 10 * y ** 2


bounds, acceptance, n = ((0, 5), (-1, 1)), 0.01, 5

ann = Annealer(
    loss=loss_func_2d,
    weights_step_size=0.1,
    bounds=bounds,
    verbose=True
)

# Weights of the best local minimum and loss at the best local minimum
w0, lmin, _, _, _, _ = ann.fit(
    npoints=n,
    stopping_limit=acceptance,
    history_path="logs"
)

In this example, calling fit will produce n directories in logs, each containing 2 files: history.csv and returns.csv. The first is the entire history of the fit; the second contains only the iteration that found the local minimum. If only one point is requested (either with npoints=1 or stop_at_first_found=True), fit will produce history.csv and returns.csv directly in logs and delete the subfolders of the runs that did not produce the local minimum.
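
The CSV files can be inspected with any CSV reader, for instance pandas (a minimal sketch, assuming a single-point fit so that the files sit directly in logs; the available columns may vary between versions):

import pandas as pd

# Full annealing trajectory of the run
history = pd.read_csv("logs/history.csv")
# Iteration that reached the local minimum
result = pd.read_csv("logs/returns.csv")

print(history.columns.tolist())  # available columns depend on the package version
print(history.tail())            # last few iterations of the fit
print(result)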

One can plot the result of a fit by doing:

from adannealing import plot

# figure will be saved in logs/annealing.pdf
fig = plot("logs", nweights=2, weights_names=["A", "B", "C"])
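
The returned fig can then be reused like any matplotlib figure, for example to save an extra copy in another format (a sketch, assuming plot returns a standard matplotlib Figure):

# Save an additional copy of the figure as a PNG next to the generated PDF
fig.savefig("logs/annealing.png", dpi=150, bbox_inches="tight")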
