Nonlinear optimisation in JAX and Equinox.
Project description
Optimistix
Optimistix is a JAX library for nonlinear solvers: root finding, minimisation, fixed points, and least squares.
Features include:
- interoperable solvers: e.g. automatically convert root-finding problems into least-squares problems, then solve them with a minimisation algorithm.
- modular optimisers: e.g. combine a BFGS quadratic model with a dogleg descent path and a trust-region update.
- support for using any PyTree as the state.
- fast compilation and runtimes.
- interoperability with Optax.
- all the benefits of working with JAX: autodiff, autoparallelism, GPU/TPU support, etc.
Installation
pip install optimistix
Requires Python 3.9+, JAX 0.4.14+, and Equinox 0.11.0+.
Documentation
Available at https://docs.kidger.site/optimistix.
Quick example
import jax.numpy as jnp
import optimistix as optx
# Let's solve the ODE dy/dt=tanh(y(t)) with the implicit Euler method.
# We need to find y1 s.t. y1 = y0 + tanh(y1)dt.
y0 = jnp.array(1.)
dt = jnp.array(0.1)
def fn(y, args):
    return y0 + jnp.tanh(y) * dt
solver = optx.Newton(rtol=1e-5, atol=1e-5)
sol = optx.fixed_point(fn, solver, y0)
y1 = sol.value # satisfies y1 == fn(y1)
Citation
If you found this library to be useful in academic work, then please cite: (arXiv link)
@article{optimistix2024,
    title={Optimistix: modular optimisation in JAX and Equinox},
    author={Jason Rader and Terry Lyons and Patrick Kidger},
    journal={arXiv:2402.09983},
    year={2024},
}
See also: other libraries in the JAX ecosystem
Always useful
Equinox: neural networks and everything not already in core JAX!
jaxtyping: type annotations for shape/dtype of arrays.
Deep learning
Optax: first-order gradient (SGD, Adam, ...) optimisers.
Orbax: checkpointing (async/multi-host/multi-device).
Levanter: scalable+reliable training of foundation models (e.g. LLMs).
Scientific computing
Diffrax: numerical differential equation solvers.
Lineax: linear solvers.
BlackJAX: probabilistic+Bayesian sampling.
sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.
PySR: symbolic regression. (Non-JAX honourable mention!)
Awesome JAX
Awesome JAX: a longer list of other JAX projects.
Credit
Optimistix was primarily built by Jason Rader (@packquickly): Twitter; GitHub; Website.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file optimistix-0.0.7.tar.gz.
File metadata
- Download URL: optimistix-0.0.7.tar.gz
- Upload date:
- Size: 53.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0c649293b01c2029a5f9878fbb912d4977105830b4368b0d44b710b621c49ac3
MD5 | 85bb0ea50abbb13a0cabead6cbe3a380
BLAKE2b-256 | b128a090023b9cf0be1359c7a14d41ba19a9b299783dc83ed1441eb812649f0f
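To check a downloaded file against the digests above, a standard-library sketch (the filename is whatever you saved locally):

```python
import hashlib

def sha256_of(path):
    """Hex SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# e.g. sha256_of("optimistix-0.0.7.tar.gz") should match the
# SHA256 value in the table above.
```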
File details
Details for the file optimistix-0.0.7-py3-none-any.whl.
File metadata
- Download URL: optimistix-0.0.7-py3-none-any.whl
- Upload date:
- Size: 81.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7d1b1b3ab5462142d38f7e1171b597e3df591c7c33de43f85435dc7c0b9aa317
MD5 | 9c8787fa4b93a7a3a90f677c3d23affa
BLAKE2b-256 | edc4a118bcb37ee32af33b4260617936bb5f0fbe10b81b9fc9f79adfb6fb8b4c