
Elegant easy-to-use neural networks in JAX.


Equinox

Equinox is your one-stop JAX library for everything you need that isn't already in core JAX:

  • neural networks (or more generally any model), with easy-to-use PyTorch-like syntax;
  • filtered APIs for transformations;
  • useful PyTree manipulation routines;
  • advanced features like runtime errors;

and best of all, Equinox isn't a framework: everything you write in Equinox is compatible with anything else in JAX or the ecosystem.

If you're completely new to JAX, then start with this CNN on MNIST example.

Coming from Flax or Haiku? The main difference is that Equinox (a) offers advanced features not found in those libraries, such as PyTree manipulation and runtime errors; and (b) builds models in a simpler way: they're just PyTrees, so they pass across JIT/grad/etc. boundaries smoothly.

Installation

pip install equinox

Requires Python 3.9+ and JAX 0.4.13+.

Documentation

Available at https://docs.kidger.site/equinox.

Quick example

Models are defined using PyTorch-like syntax:

import equinox as eqx
import jax

class Linear(eqx.Module):
    # Field annotations declare the leaves of the module's PyTree.
    weight: jax.Array
    bias: jax.Array

    def __init__(self, in_size, out_size, key):
        wkey, bkey = jax.random.split(key)
        self.weight = jax.random.normal(wkey, (out_size, in_size))
        self.bias = jax.random.normal(bkey, (out_size,))

    def __call__(self, x):
        return self.weight @ x + self.bias

and fully compatible with normal JAX operations:

@jax.jit
@jax.grad
def loss_fn(model, x, y):
    pred_y = jax.vmap(model)(x)
    return jax.numpy.mean((y - pred_y) ** 2)

batch_size, in_size, out_size = 32, 2, 3
model = Linear(in_size, out_size, key=jax.random.PRNGKey(0))
x = jax.numpy.zeros((batch_size, in_size))
y = jax.numpy.zeros((batch_size, out_size))
grads = loss_fn(model, x, y)

Finally, there's no magic behind the scenes. All eqx.Module does is register your class as a PyTree. From that point onwards, JAX already knows how to work with PyTrees.

Citation

If you find this library useful in academic work, then please cite: (arXiv link)

@article{kidger2021equinox,
    author={Patrick Kidger and Cristian Garcia},
    title={{E}quinox: neural networks in {JAX} via callable {P}y{T}rees and filtered transformations},
    year={2021},
    journal={Differentiable Programming workshop at Neural Information Processing Systems 2021}
}

(Also consider starring the project on GitHub.)

See also: other libraries in the JAX ecosystem

Optax: first-order gradient (SGD, Adam, ...) optimisers.

Diffrax: numerical differential equation solvers.

Lineax: linear solvers and linear least squares.

jaxtyping: type annotations for shape/dtype of arrays.

Eqxvision: computer vision models.

sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.

Levanter: scalable+reliable training of foundation models (e.g. LLMs).

Disclaimer

Equinox is maintained by Patrick Kidger at Google X, but this is not an official Google product.
