
A drop-in replacement for scipy.optimize functions with quality-of-life improvements

Project description

Better Optimization!

better_optimize is a friendlier front-end to scipy's optimize.minimize and optimize.root functions. Features include:

  • Progress bar!
  • Early stopping!
  • Better propagation of common arguments (maxiter, tol)!

Installation

To install better_optimize, simply use conda:

conda install -c conda-forge better_optimize

Or, if you prefer pip:

pip install better_optimize

What does better_optimize provide over basic scipy?

1. Progress Bars

All optimization routines in better_optimize can display a rich, informative progress bar using the rich library. This includes:

  • Iteration counts, elapsed time, and objective values.
  • Gradient and Hessian norms (when available).
  • Separate progress bars for global (basinhopping) and local (minimizer) steps.
  • Toggleable display for headless or script environments.

2. Flat and Generalized Keyword Arguments

  • No more nested options dictionaries! You can pass tol, maxiter, and other common options directly as top-level keyword arguments.
  • better_optimize automatically sorts and promotes these arguments to the correct place for each optimizer.
  • Generalizes argument handling: always provides tol and maxiter (or their equivalents) to the optimizer, even if you forget.
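To make the difference concrete, here is the same minimization written against plain scipy, where `maxiter` must be buried inside the `options` dict. The flat `better_optimize` equivalent is shown as a comment for comparison (it mirrors the example later in this README):

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)

# Plain scipy: `maxiter` must be nested inside `options`
res = minimize(
    rosenbrock,
    x0=np.array([-1.0, 2.0]),
    method="L-BFGS-B",
    tol=1e-6,
    options={"maxiter": 1000},
)

# The equivalent better_optimize call takes `maxiter` at the top level:
# from better_optimize import minimize
# res = minimize(rosenbrock, x0=[-1, 2], method="L-BFGS-B",
#                tol=1e-6, maxiter=1000)
```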

3. Argument Checking and Validation

  • Automatic checking of provided gradient (jac), Hessian (hess), and Hessian-vector (hessp) functions.
  • Warns if you provide unnecessary or unused arguments for a given method.
  • Detects and handles fused objective functions (e.g., functions returning (loss, grad) or (loss, grad, hess) tuples).
  • Ensures that the correct function signatures and return types are used for each optimizer.
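As a sketch of what "fused objective" means: with plain scipy you must pass an explicit `jac=True` flag when the function returns a `(loss, grad)` tuple, whereas `better_optimize` inspects the return value and sets this for you. The objective below is illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def loss_and_grad(x):
    # Fused objective: returns (loss, grad) in a single pass
    loss = np.sum((x - 3.0) ** 2)
    grad = 2.0 * (x - 3.0)
    return loss, grad

# Plain scipy needs jac=True to know the function is fused;
# better_optimize detects the tuple return automatically.
res = minimize(loss_and_grad, x0=np.zeros(4), jac=True, method="BFGS")
```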

4. LRUCache1 for Fused Functions

  • Provides an LRUCache1 utility to cache the results of expensive objective/gradient/Hessian computations.
  • Especially useful for triple-fused functions that return value, gradient, and Hessian together, avoiding redundant computation.
  • Totally invisible: just pass a function with 3 return values, and caching is seamlessly integrated into the optimization workflow.
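The idea behind a size-1 cache can be sketched in a few lines. The real LRUCache1 API may differ; the names below are illustrative, and the counter just demonstrates that a repeated call at the same point does not re-run the expensive function:

```python
import numpy as np

def lru_cache_1(fn):
    """Minimal sketch of a size-1 cache: if the optimizer asks for the
    value and then the gradient at the same x, the fused fn runs once."""
    last = {"key": None, "result": None}

    def wrapped(x):
        key = x.tobytes()
        if key != last["key"]:
            last["key"] = key
            last["result"] = fn(np.array(x, copy=True))
        return last["result"]

    return wrapped

calls = {"n": 0}

def fused(x):
    calls["n"] += 1
    return np.sum(x**2), 2.0 * x  # (value, grad)

cached = lru_cache_1(fused)
x = np.ones(3)
value, _ = cached(x)   # computes
_, grad = cached(x)    # served from the cache, fused() is not re-run
```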

5. Robust Basin-Hopping with Failure Tolerance

  • Enhanced basinhopping implementation allows you to continue even if the local minimizer fails.
  • Optionally accepts and stores failed minimizer results if they improve the global minimum.
  • Useful for noisy or non-smooth objective functions where local minimization may occasionally fail.
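For reference, scipy's own basinhopping is called as below (the objective is the classic multimodal example from the scipy docs, and `niter` is illustrative). The better_optimize version is described above as accepting the same kind of call while tolerating local-minimizer failures; that behavior is paraphrased in the comment, not demonstrated here:

```python
import numpy as np
from scipy.optimize import basinhopping

# A multimodal 1-D objective with many local minima;
# the global minimum is near x = -0.195
def f(x):
    return np.cos(14.5 * x[0] - 0.3) + (x[0] + 0.2) * x[0]

res = basinhopping(f, x0=[1.0], niter=200,
                   minimizer_kwargs={"method": "L-BFGS-B"})

# In better_optimize's enhanced basinhopping, a local step that raises
# is skipped (or kept, if it still improves the global best) instead of
# aborting the whole run.
```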

Example Usage

Simple Example

from better_optimize import minimize

def rosenbrock(x):
    return sum(100.0*(x[1:] - x[:-1]**2.0)**2.0 + (1 - x[:-1])**2.0)

result = minimize(
    rosenbrock,
    x0=[-1, 2],
    method="L-BFGS-B",
    tol=1e-6,
    maxiter=1000,
    progressbar=True,  # Show a rich progress bar!
)
  Minimizing                                         Elapsed   Iteration   Objective    ||grad||
 ──────────────────────────────────────────────────────────────────────────────────────────────────
  ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━   0:00:00   721/721     0.34271757   0.92457651

The result object is a standard OptimizeResult from scipy.optimize, so there are no surprises there!

Triple-Fused Function using Pytensor

from better_optimize import minimize
import pytensor.tensor as pt
from pytensor import function
import numpy as np

x = pt.vector('x')
value = pt.sum(100.0*(x[1:] - x[:-1]**2.0)**2.0 + (1 - x[:-1])**2.0)
grad = pt.grad(value, x)
hess = pt.hessian(value, x)

fused_fn = function([x], [value, grad, hess])
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

result = minimize(
    fused_fn, # No need to set flags separately, `better_optimize` handles it!
    x0=x0,
    method="Newton-CG",
    tol=1e-6,
    maxiter=1000,
    progressbar=True,  # Show a rich progress bar!
)

Many sub-computations are repeated between the objective, gradient, and Hessian functions. Plain scipy allows you to pass a fused value_and_grad function, but better_optimize also lets you pass a triple-fused value_grad_and_hess function. This avoids redundant computation and speeds up the optimization process.
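The same triple-fused pattern can also be written by hand in plain NumPy, with no pytensor dependency. The quadratic below is illustrative; the shared `A @ x` product is where the savings come from:

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 * x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

def value_grad_hess(x):
    Ax = A @ x                       # computed once, reused by value and grad
    value = 0.5 * x @ Ax - b @ x
    grad = Ax - b
    hess = A                         # constant Hessian for a quadratic
    return value, grad, hess
```

A function like this can be passed directly as the objective (e.g. with method="Newton-CG"), and better_optimize unpacks the three return values.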

Contributing

We welcome contributions! If you find a bug, have a feature request, or want to improve the documentation, please open an issue or submit a pull request on GitHub.

Project details


Download files

Download the file for your platform.

Source Distribution

better_optimize-0.1.6.tar.gz (19.1 kB view details)

Uploaded Source

Built Distribution


better_optimize-0.1.6-py3-none-any.whl (20.1 kB view details)

Uploaded Python 3

File details

Details for the file better_optimize-0.1.6.tar.gz.

File metadata

  • Download URL: better_optimize-0.1.6.tar.gz
  • Upload date:
  • Size: 19.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for better_optimize-0.1.6.tar.gz:

  • SHA256: 7fedab9769cb2a0bd27a6a7b0a660a625d8f139d589d27d35201dfdf7c59bd44
  • MD5: bd6d02738f9c4a7cad2cb38bb557e0fb
  • BLAKE2b-256: b2b48e4da1ab8ca60430a74ab19e7be95cec4ca39831e8665e0241c2b4f780ee


Provenance

The following attestation bundles were made for better_optimize-0.1.6.tar.gz:

Publisher: release.yml on jessegrabowski/better_optimize

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file better_optimize-0.1.6-py3-none-any.whl.

File metadata

File hashes

Hashes for better_optimize-0.1.6-py3-none-any.whl:

  • SHA256: bc409f2ee726765c79134ead75815c8309180eba367eab9286ddb8c90fa1bfa9
  • MD5: 17834bcaca58bca7075ca8178e411f35
  • BLAKE2b-256: 5815915ceb4340a3c1a5d3bc8349c525fa42592614b68a249250eb9be3f7e5a1


Provenance

The following attestation bundles were made for better_optimize-0.1.6-py3-none-any.whl:

Publisher: release.yml on jessegrabowski/better_optimize

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
