PyTorch-like neural networks in JAX
Equinox
Equinox is a JAX library based around a simple idea: represent parameterised functions (such as neural networks) as PyTrees.
In doing so:
- We get a PyTorch-like API...
- ...that's fully compatible with native JAX transformations...
- ...with no new concepts you have to learn. (It's all just PyTrees.)
The elegance of Equinox is its selling point in a world that already has Haiku, Flax and so on.
(In other words, why should you care? Because Equinox is really simple to learn, and really simple to use.)
Installation
```bash
pip install equinox
```
Requires Python 3.7+ and JAX 0.3.4+.
Documentation
Available at https://docs.kidger.site/equinox.
Quick example
Models are defined using PyTorch-like syntax:
```python
import equinox as eqx
import jax

class Linear(eqx.Module):
    weight: jax.numpy.ndarray
    bias: jax.numpy.ndarray

    def __init__(self, in_size, out_size, key):
        wkey, bkey = jax.random.split(key)
        self.weight = jax.random.normal(wkey, (out_size, in_size))
        self.bias = jax.random.normal(bkey, (out_size,))

    def __call__(self, x):
        return self.weight @ x + self.bias
```
and fully compatible with normal JAX operations:
```python
@jax.jit
@jax.grad
def loss_fn(model, x, y):
    pred_y = jax.vmap(model)(x)
    return jax.numpy.mean((y - pred_y) ** 2)

batch_size, in_size, out_size = 32, 2, 3
model = Linear(in_size, out_size, key=jax.random.PRNGKey(0))
x = jax.numpy.zeros((batch_size, in_size))
y = jax.numpy.zeros((batch_size, out_size))

grads = loss_fn(model, x, y)
```
Finally, there's no magic behind the scenes. All eqx.Module does is register your class as a PyTree. From that point onwards, JAX already knows how to work with PyTrees.
Citation
If you find this library useful in academic work, please cite the following (arXiv link):
```bibtex
@article{kidger2021equinox,
    author={Patrick Kidger and Cristian Garcia},
    title={{E}quinox: neural networks in {JAX} via callable {P}y{T}rees and filtered transformations},
    year={2021},
    journal={Differentiable Programming workshop at Neural Information Processing Systems 2021}
}
```
(Also consider starring the project on GitHub.)
See also
Numerical differential equation solvers: Diffrax.
Type annotations and runtime checking for PyTrees and shape/dtype of JAX arrays: jaxtyping.
SymPy<->JAX conversion; train symbolic expressions via gradient descent: sympy2jax.
File details
Details for the file equinox-0.7.0.tar.gz.
File metadata
- Download URL: equinox-0.7.0.tar.gz
- Upload date:
- Size: 57.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.13
File hashes

Algorithm | Hash digest
---|---
SHA256 | 9f831ca3bdefcaf599522a832202a0fef4e9c218eae1cdd5a7f4821a1b136ff1
MD5 | 5c63af11b4e6b5c7d2495478e8548034
BLAKE2b-256 | eb288632c3504e54321d115ed9637547af2f52c80fc038562e888c750161fa83
File details
Details for the file equinox-0.7.0-py3-none-any.whl.
File metadata
- Download URL: equinox-0.7.0-py3-none-any.whl
- Upload date:
- Size: 68.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.13
File hashes

Algorithm | Hash digest
---|---
SHA256 | 9f1b5f3fe900dc72cc12778210105cc11805da9355345e17bbe6b6b9ac1fd73b
MD5 | eed15b4f5fe5d25b1eb9dfd3560214cc
BLAKE2b-256 | e6bae21ca8b6a76ec04a79c51100266614109aff7bae935ab0f507c08d9f3134