Multiple dispatch in JAX via custom interpreters.

Quax

JAX + multiple dispatch + custom array-ish objects

For example, this can mean overloading matrix multiplication to exploit sparsity or structure, or automatically rewriting a LoRA's matmul (W + AB)v into the more efficient Wv + ABv.
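
To see why the LoRA rewrite pays off: (W + AB)v materialises a full n×m matrix (forming AB alone costs O(nmr) work), whilst Wv + A(Bv) only ever performs matrix-vector products, costing O(nm + r(n + m)). Here is a quick sketch in plain JAX; the shapes are illustrative choices, not anything from the library:

import jax.numpy as jnp
import jax.random as jr

# Illustrative shapes: W is (n, m), A is (n, r), B is (r, m), with the
# rank r much smaller than n and m.
kw, ka, kb, kv = jr.split(jr.PRNGKey(0), 4)
n, m, r = 12, 10, 2
W = jr.normal(kw, (n, m))
A = jr.normal(ka, (n, r))
B = jr.normal(kb, (r, m))
v = jr.normal(kv, (m,))

naive = (W + A @ B) @ v          # materialises the full (n, m) update
efficient = W @ v + A @ (B @ v)  # only matrix-vector products
assert jnp.allclose(naive, efficient, atol=1e-5)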

Applications include:

  • LoRA weight matrices
  • symbolic zeros
  • arrays with named dimensions
  • structured (e.g. tridiagonal) matrices
  • sparse arrays
  • quantised arrays
  • arrays with physical units attached
  • etc! (See the built-in quax.examples library for most of the above!)

This works via a custom JAX transform: take an existing JAX program, wrap it in quax.quaxify, and then pass in the custom array-ish objects. This means it works even with existing programs that were not written to accept such array-ish objects!

(Just as jax.vmap takes a program and reinterprets each operation as its batched version, so too does quax.quaxify take a program and reinterpret each operation according to the array-ish types that are passed in.)
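
Concretely, defining your own array-ish type follows the quax.ArrayValue/quax.register pattern described in the documentation. The sketch below is illustrative only: the Meters type and its single rule are invented here, not part of the library.

import jax
import jax.core
import jax.numpy as jnp
import quax

class Meters(quax.ArrayValue):
    """A hypothetical array-ish object carrying a physical unit."""

    array: jax.Array

    def aval(self):
        # Shape/dtype metadata, so that JAX can trace through this object.
        return jax.core.ShapedArray(self.array.shape, self.array.dtype)

    def materialise(self):
        # Refuse to silently decay into a unitless array.
        raise ValueError("Refusing to drop the unit")

@quax.register(jax.lax.add_p)
def _(x: Meters, y: Meters) -> Meters:
    # Metres + metres is well-defined. No rule is registered for adding a
    # Meters to a plain array, so that combination raises an error rather
    # than silently mixing units.
    return Meters(x.array + y.array)

quax.quaxify(lambda a, b: a + b)(Meters(jnp.arange(3.0)), Meters(jnp.ones(3)))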

Installation

pip install quax

Documentation

Available at https://docs.kidger.site/quax.

Example: LoRA

This example demonstrates everything you need to use the built-in quax.examples.lora library.

import equinox as eqx
import jax.random as jr
import quax
import quax.examples.lora as lora

#
# Start off with any JAX program: here, the forward pass through a linear layer.
#

key1, key2, key3 = jr.split(jr.PRNGKey(0), 3)
linear = eqx.nn.Linear(10, 12, key=key1)
vector = jr.normal(key2, (10,))

def run(model, x):
  return model(x)

run(linear, vector)  # can call this as normal

#
# Now let's LoRA-ify it.
#

# Step 1: make the weight be a LoraArray.
lora_weight = lora.LoraArray(linear.weight, rank=2, key=key3)
lora_linear = eqx.tree_at(lambda l: l.weight, linear, lora_weight)
# Step 2: quaxify the function and call it as before. The transform runs the
# original computation whilst looking up any registered multiple dispatch
# rules. (In this case, for doing matmuls against LoraArrays.)
quax.quaxify(run)(lora_linear, vector)
# Appendix: Quax includes a helper to automatically apply Step 1 to all
# `eqx.nn.Linear` layers in a model.
lora_linear = lora.loraify(linear, rank=2, key=key3)
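
Since quax.quaxify returns an ordinary function of pytrees, it should compose with other JAX transforms. The following sketch reuses run, lora_linear and vector from above; that quaxify composes with jax.jit is our assumption here, not something the example itself demonstrates.

import jax

fast_run = jax.jit(quax.quaxify(run))
fast_run(lora_linear, vector)  # same result as before, now JIT-compiled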

Work in progress!

Right now, the following are not supported:

  • Control flow primitives (e.g. jax.lax.cond).
  • jax.custom_vjp.

It should be fairly straightforward to add support for these; open an issue or pull request.

See also: other libraries in the JAX ecosystem

Equinox: neural networks.

jaxtyping: type annotations for shape/dtype of arrays.

Optax: first-order gradient (SGD, Adam, ...) optimisers.

Diffrax: numerical differential equation solvers.

Optimistix: root finding, minimisation, fixed points, and least squares.

Lineax: linear solvers.

BlackJAX: probabilistic+Bayesian sampling.

Orbax: checkpointing (async/multi-host/multi-device).

sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.

Eqxvision: computer vision models.

Levanter: scalable+reliable training of foundation models (e.g. LLMs).

PySR: symbolic regression. (Non-JAX honourable mention!)

Acknowledgements

Significantly inspired by https://github.com/davisyoshida/qax, https://github.com/stanford-crfm/levanter, and jax.experimental.sparse.
