A drop-in replacement for scipy optimize functions with quality of life improvements

Project description

Better Optimization!

better_optimize is a friendlier front-end to scipy's optimize.minimize and optimize.root functions. Features include:

  • Progress bar!
  • Early stopping!
  • Better propagation of common arguments (maxiter, tol)!

Installation

To install better_optimize, simply use conda:

conda install -c conda-forge better_optimize

Or, if you prefer pip:

pip install better_optimize

What does better_optimize provide over basic scipy?

1. Progress Bars

All optimization routines in better_optimize can display a rich, informative progress bar using the rich library. This includes:

  • Iteration counts, elapsed time, and objective values.
  • Gradient and Hessian norms (when available).
  • Separate progress bars for global (basinhopping) and local (minimizer) steps.
  • Toggleable display for headless or script environments (see the sketch below).
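
In a headless or scripted run, the bar can simply be switched off. A minimal sketch, reusing the progressbar keyword shown in the examples further down:

from better_optimize import minimize

# progressbar=False suppresses the rich display, e.g. in CI logs
result = minimize(
    lambda x: (x[0] - 1.0) ** 2,
    x0=[0.0],
    method="BFGS",
    progressbar=False,
)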

2. Flat and Generalized Keyword Arguments

  • No more nested options dictionaries! You can pass tol, maxiter, and other common options directly as top-level keyword arguments.
  • better_optimize automatically routes these arguments to the correct place for each optimizer.
  • Generalizes argument handling: tol and maxiter (or their equivalents) are always passed to the optimizer, even if you omit them. See the sketch below.
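
A sketch of the difference; the stock scipy call is shown for comparison:

from scipy.optimize import minimize as scipy_minimize
from better_optimize import minimize

def f(x):
    return (x[0] - 3.0) ** 2

# Stock scipy: solver-specific settings live in a nested options dict
scipy_minimize(f, x0=[0.0], method="Nelder-Mead", options={"maxiter": 500, "xatol": 1e-8})

# better_optimize: common settings are flat keyword arguments, routed to
# the right place for the chosen method
minimize(f, x0=[0.0], method="Nelder-Mead", maxiter=500, tol=1e-8)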

3. Argument Checking and Validation

  • Automatic checking of provided gradient (jac), Hessian (hess), and Hessian-vector (hessp) functions.
  • Warns if you provide unnecessary or unused arguments for a given method.
  • Detects and handles fused objective functions (e.g., functions returning (loss, grad) or (loss, grad, hess) tuples); see the example below.
  • Ensures that the correct function signatures and return types are used for each optimizer.
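
For instance, a double-fused objective that returns (loss, grad) is detected automatically; with stock scipy you would also have to pass jac=True:

import numpy as np
from better_optimize import minimize

def loss_and_grad(x):
    # Fused objective: value and gradient computed together
    loss = np.sum((x - 2.0) ** 2)
    grad = 2.0 * (x - 2.0)
    return loss, grad

# The two-element return is detected, so no jac flag is needed
result = minimize(loss_and_grad, x0=np.zeros(3), method="L-BFGS-B", tol=1e-8)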

4. LRUCache1 for Fused Functions

  • Provides an LRUCache1 utility to cache the results of expensive objective/gradient/Hessian computations.
  • Especially useful for triple-fused functions that return value, gradient, and Hessian together, avoiding redundant computation.
  • Totally invisible: just pass a function with three return values and it is seamlessly integrated into the optimization workflow (the idea is sketched below).
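
The idea behind a size-1 cache, sketched in plain Python. This is only an illustration of the concept, not the library's actual LRUCache1 API:

import numpy as np

class SingleEntryCache:
    # Illustrative size-1 cache: remember the last input and its
    # (value, grad, hess) triple, so the optimizer's separate calls for
    # value, gradient, and Hessian reuse one fused evaluation.
    def __init__(self, fused_fn):
        self.fused_fn = fused_fn
        self.last_x = None
        self.last_result = None

    def __call__(self, x):
        if self.last_x is None or not np.array_equal(x, self.last_x):
            self.last_x = np.array(x, copy=True)
            self.last_result = self.fused_fn(x)
        return self.last_result

better_optimize sets this kind of cache up for you, which is why the triple-fused example below needs no extra configuration.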

5. Robust Basin-Hopping with Failure Tolerance

  • The enhanced basinhopping implementation lets the global search continue even if the local minimizer fails (see the sketch after this list).
  • Optionally accepts and stores failed minimizer results if they improve the global minimum.
  • Useful for noisy or non-smooth objective functions where local minimization may occasionally fail.
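
A minimal sketch, assuming the package exposes a basinhopping wrapper that mirrors scipy.optimize.basinhopping's call signature:

import numpy as np
from better_optimize import basinhopping  # assumed to be exported alongside minimize

def bumpy(x):
    # An objective with many local minima (from the scipy docs)
    return np.cos(14.5 * x[0] - 0.3) + (x[0] + 0.2) * x[0]

result = basinhopping(
    bumpy,
    x0=[1.0],
    niter=100,
    minimizer_kwargs={"method": "L-BFGS-B"},
    progressbar=True,  # separate bars for the global and local steps
)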

Example Usage

Simple Example

from better_optimize import minimize

def rosenbrock(x):
    return sum(100.0*(x[1:] - x[:-1]**2.0)**2.0 + (1 - x[:-1])**2.0)

result = minimize(
    rosenbrock,
    x0=[-1, 2],
    method="L-BFGS-B",
    tol=1e-6,
    maxiter=1000,
    progressbar=True,  # Show a rich progress bar!
)
  Minimizing                                         Elapsed   Iteration   Objective    ||grad||
 ──────────────────────────────────────────────────────────────────────────────────────────────────
  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━   0:00:00   721/721     0.34271757   0.92457651

The result object is a standard OptimizeResult from scipy.optimize, so there are no surprises there!
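
For instance, the familiar fields are all there:

print(result.x)        # parameter values at the minimum
print(result.fun)      # objective value at the minimum
print(result.success)  # convergence flag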

Triple-Fused Function using Pytensor

from better_optimize import minimize
import pytensor.tensor as pt
from pytensor import function
import numpy as np

x = pt.vector('x')
value = pt.sum(100.0*(x[1:] - x[:-1]**2.0)**2.0 + (1 - x[:-1])**2.0)
grad = pt.grad(value, x)
hess = pt.hessian(value, x)

fused_fn = function([x], [value, grad, hess])
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

result = minimize(
    fused_fn, # No need to set flags separately, `better_optimize` handles it!
    x0=x0,
    method="Newton-CG",
    tol=1e-6,
    maxiter=1000,
    progressbar=True,  # Show a rich progress bar!
)

Many sub-computations are shared between the objective, gradient, and Hessian. Scipy allows you to pass a fused value_and_grad function, but better_optimize also lets you pass a triple-fused value_grad_and_hess function like the one above. This avoids redundant computation and speeds up the optimization process.

Contributing

We welcome contributions! If you find a bug, have a feature request, or want to improve the documentation, please open an issue or submit a pull request on GitHub.


Download files

Download the file for your platform.

Source Distribution

better_optimize-0.1.3.tar.gz (19.0 kB)

Uploaded Source

Built Distribution

better_optimize-0.1.3-py3-none-any.whl (20.0 kB)

Uploaded Python 3

File details

Details for the file better_optimize-0.1.3.tar.gz.

File metadata

  • Download URL: better_optimize-0.1.3.tar.gz
  • Upload date:
  • Size: 19.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for better_optimize-0.1.3.tar.gz:

  • SHA256: 6af1bd50e6693500f4ea1e625c08bbeaccb515bd13c100c7591274270d4e3aa4
  • MD5: c1e27a1b560fa6cb4b73bd72b3a2c104
  • BLAKE2b-256: c11dbdc768318b7610ce5c08468e945fcce35495d39ab4aa7e6f676cacb4dc3f

Provenance

The following attestation bundles were made for better_optimize-0.1.3.tar.gz:

Publisher: release.yml on jessegrabowski/better_optimize

File details

Details for the file better_optimize-0.1.3-py3-none-any.whl.

File hashes

Hashes for better_optimize-0.1.3-py3-none-any.whl:

  • SHA256: 18ecce45a4dd39d1285639e8d11a008c72cb6b1feb8c473746149fb61cc7f300
  • MD5: 15269c82a37a5ff66eb2f6baa4b0ad86
  • BLAKE2b-256: 674819d0dfd8af07ba804ac00b53ff1b87f774198d6c50a4ae14117c38f9df09

Provenance

The following attestation bundles were made for better_optimize-0.1.3-py3-none-any.whl:

Publisher: release.yml on jessegrabowski/better_optimize
