Elegant, easy-to-use neural networks in JAX.

Reason this release was yanked: buffer.at[not-an-integer] does not work.

Project description

Equinox

Equinox is a JAX library for parameterised functions (e.g. neural networks) offering:

  • a PyTorch-like API...
  • ...that's fully compatible with native JAX transformations...
  • ...with no new concepts you have to learn.

If you're completely new to JAX, then start with this CNN on MNIST example.
If you're already familiar with JAX, then the main idea is to represent parameterised functions (such as neural networks) as PyTrees, so that they can pass across JIT/grad/etc. boundaries smoothly.

The elegance of Equinox is its selling point in a world that already has Haiku, Flax and so on.

In other words, why should you care? Because Equinox is really simple to learn, and really simple to use.

Installation

pip install equinox

Requires Python 3.9+ and JAX 0.4.13+.

Documentation

Available at https://docs.kidger.site/equinox.

Quick example

Models are defined using PyTorch-like syntax:

import equinox as eqx
import jax

class Linear(eqx.Module):
    weight: jax.Array
    bias: jax.Array

    def __init__(self, in_size, out_size, key):
        wkey, bkey = jax.random.split(key)
        self.weight = jax.random.normal(wkey, (out_size, in_size))
        self.bias = jax.random.normal(bkey, (out_size,))

    def __call__(self, x):
        return self.weight @ x + self.bias

and are fully compatible with normal JAX operations:

@jax.jit
@jax.grad
def loss_fn(model, x, y):
    pred_y = jax.vmap(model)(x)
    return jax.numpy.mean((y - pred_y) ** 2)

batch_size, in_size, out_size = 32, 2, 3
model = Linear(in_size, out_size, key=jax.random.PRNGKey(0))
x = jax.numpy.zeros((batch_size, in_size))
y = jax.numpy.zeros((batch_size, out_size))
grads = loss_fn(model, x, y)

Finally, there's no magic behind the scenes. All eqx.Module does is register your class as a PyTree. From that point onwards, JAX already knows how to work with PyTrees.

Citation

If you found this library to be useful in academic work, then please cite:

@article{kidger2021equinox,
    author={Patrick Kidger and Cristian Garcia},
    title={{E}quinox: neural networks in {JAX} via callable {P}y{T}rees and filtered transformations},
    year={2021},
    journal={Differentiable Programming workshop at Neural Information Processing Systems 2021}
}

(Also consider starring the project on GitHub.)

See also: other libraries in the JAX ecosystem

Optax: first-order gradient (SGD, Adam, ...) optimisers.

Diffrax: numerical differential equation solvers.

Lineax: linear solvers and linear least squares.

jaxtyping: type annotations for shape/dtype of arrays.

Eqxvision: computer vision models.

sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.

Download files


Source Distribution

equinox-0.10.8.tar.gz (100.6 kB)


Built Distribution

equinox-0.10.8-py3-none-any.whl (130.2 kB)


File details

Details for the file equinox-0.10.8.tar.gz.

File metadata

  • Download URL: equinox-0.10.8.tar.gz
  • Size: 100.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.4

File hashes

Hashes for equinox-0.10.8.tar.gz
  • SHA256: 83831e01eecb82512bebbaca9879a24859431e8f2c7650bf7edac38fb0c6dbba
  • MD5: 69c7e00dd4dfb4e8f770d2f75631349e
  • BLAKE2b-256: f9039af8ab1fe724d42a6ace2a65797e0c2492c4d737b0b53da59b3c9dfe0b15


File details

Details for the file equinox-0.10.8-py3-none-any.whl.

File metadata

  • Download URL: equinox-0.10.8-py3-none-any.whl
  • Size: 130.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.4

File hashes

Hashes for equinox-0.10.8-py3-none-any.whl
  • SHA256: afb7e6ef3ea2c1447d0d63feb2134b5e99da1de2e4499d29d1b250a352a97081
  • MD5: 27ed647f26a5b0cb024fd2807e01f6bb
  • BLAKE2b-256: 4764634a7f0e77919fa21622a3e036d054e924a4a8a9a43ab712f599427c6290

