
Elegant, easy-to-use neural networks in JAX.


Equinox

Equinox is your one-stop JAX library for everything you need that isn't already in core JAX:

  • neural networks (or more generally any model), with easy-to-use PyTorch-like syntax;
  • filtered APIs for transformations;
  • useful PyTree manipulation routines;
  • advanced features like runtime errors;

and best of all, Equinox isn't a framework: everything you write in Equinox is compatible with anything else in JAX or the ecosystem.

If you're completely new to JAX, then start with this CNN on MNIST example.

Coming from Flax or Haiku? The main difference is that Equinox (a) offers a lot of advanced features not found in these libraries, like PyTree manipulation or runtime errors; (b) has a simpler way of building models: they're just PyTrees, so they can pass across JIT/grad/etc. boundaries smoothly.

Installation

Requires Python 3.10+.

pip install equinox

Equinox is also available through a community-supported build on conda-forge.

Documentation

Available at https://docs.kidger.site/equinox.

Quick example

Models are defined using PyTorch-like syntax:

import equinox as eqx
import jax

class Linear(eqx.Module):
    weight: jax.Array
    bias: jax.Array

    def __init__(self, in_size, out_size, key):
        wkey, bkey = jax.random.split(key)
        self.weight = jax.random.normal(wkey, (out_size, in_size))
        self.bias = jax.random.normal(bkey, (out_size,))

    def __call__(self, x):
        return self.weight @ x + self.bias

and are fully compatible with normal JAX operations:

@jax.jit
@jax.grad
def loss_fn(model, x, y):
    pred_y = jax.vmap(model)(x)
    return jax.numpy.mean((y - pred_y) ** 2)

batch_size, in_size, out_size = 32, 2, 3
model = Linear(in_size, out_size, key=jax.random.PRNGKey(0))
x = jax.numpy.zeros((batch_size, in_size))
y = jax.numpy.zeros((batch_size, out_size))
grads = loss_fn(model, x, y)

Finally, there's no magic behind the scenes. All eqx.Module does is register your class as a PyTree. From that point onwards, JAX already knows how to work with PyTrees.

Citation

If you found this library to be useful in academic work, then please cite: (arXiv link)

@article{kidger2021equinox,
    author={Patrick Kidger and Cristian Garcia},
    title={{E}quinox: neural networks in {JAX} via callable {P}y{T}rees and filtered transformations},
    year={2021},
    journal={Differentiable Programming workshop at Neural Information Processing Systems 2021}
}

(Also consider starring the project on GitHub.)

See also: other libraries in the JAX ecosystem

Always useful
jaxtyping: type annotations for shape/dtype of arrays.

Deep learning
Optax: first-order gradient (SGD, Adam, ...) optimisers.
Orbax: checkpointing (async/multi-host/multi-device).
Levanter: scalable+reliable training of foundation models (e.g. LLMs).
paramax: parameterizations and constraints for PyTrees.

Scientific computing
Diffrax: numerical differential equation solvers.
Optimistix: root finding, minimisation, fixed points, and least squares.
Lineax: linear solvers.
BlackJAX: probabilistic+Bayesian sampling.
sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.
PySR: symbolic regression. (Non-JAX honourable mention!)

Awesome JAX
Awesome Equinox
Awesome JAX: a longer list of other JAX projects.


Download files

Download the file for your platform.

Source Distribution

equinox-0.13.4.tar.gz (141.0 kB)

Built Distribution


equinox-0.13.4-py3-none-any.whl (181.2 kB)

File details

Details for the file equinox-0.13.4.tar.gz.

File metadata

  • Download URL: equinox-0.13.4.tar.gz
  • Size: 141.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for equinox-0.13.4.tar.gz
Algorithm Hash digest
SHA256 d4eed5d7f981a5ddcb7bc70e601707769fb4da20f777703cc6e01a6248af9758
MD5 c6be012f1ac85ee1dd3d4e4baade3d75
BLAKE2b-256 2d801704c3375a9c3cae4713a1be12b59dbd8ddff03808862264be622b333cd1


File details

Details for the file equinox-0.13.4-py3-none-any.whl.

File metadata

  • Download URL: equinox-0.13.4-py3-none-any.whl
  • Size: 181.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for equinox-0.13.4-py3-none-any.whl
Algorithm Hash digest
SHA256 8203b6b84c1641a58aad07c040c171fc5a38e2c3bc90c5feb2f221d398354378
MD5 00c5e4dd0dcbd912eeadade4ed8837e4
BLAKE2b-256 964ceea4860010ba31d6793038073c66f2e1755c80d0adb704c4926d549e724a

