
Reason this release was yanked: tree_partition critical bug

Project description

[Inox's banner]

Stainless neural networks in JAX

Inox is a minimal JAX library for neural networks with an intuitive PyTorch-like syntax. As with Equinox, modules are represented as PyTrees, which enables complex architectures, easy manipulations, and functional transformations.

Inox aims to be a leaner version of Equinox by only retaining its core features: PyTrees and lifted transformations. In addition, Inox takes inspiration from other projects like NNX and Serket to provide a versatile interface.
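Because every module is a PyTree, the usual JAX tree utilities apply to it directly; a minimal sketch, assuming a single Linear layer as in the example below:

import jax
import inox.nn as nn

layer = nn.Linear(3, 64, key=jax.random.key(0))

# A module flattens into its array leaves like any other PyTree.
print([leaf.shape for leaf in jax.tree_util.tree_leaves(layer)])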

Inox means "stainless steel" in French 🔪

Installation

The inox package is available on PyPI and can be installed via pip.

pip install inox

Alternatively, if you need the latest features, you can install it from the repository.

pip install git+https://github.com/francois-rozet/inox

Getting started

Modules are defined with an intuitive PyTorch-like syntax,

import jax
import inox
import inox.nn as nn

init_key, data_key = jax.random.split(jax.random.key(0))

class MLP(nn.Module):
    def __init__(self, key):
        keys = jax.random.split(key, 3)

        self.l1 = nn.Linear(3, 64, key=keys[0])
        self.l2 = nn.Linear(64, 64, key=keys[1])
        self.l3 = nn.Linear(64, 3, key=keys[2])
        self.relu = nn.ReLU()

    def __call__(self, x):
        x = self.l1(x)
        x = self.l2(self.relu(x))
        x = self.l3(self.relu(x))

        return x

model = MLP(init_key)

and are compatible with JAX transformations.

X = jax.random.normal(data_key, (1024, 3))
Y = jax.numpy.sort(X, axis=-1)

@jax.jit
def loss_fn(model, x, y):
    pred = jax.vmap(model)(x)
    return jax.numpy.mean((y - pred) ** 2)

grads = jax.grad(loss_fn)(model, X, Y)
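Since the gradients returned by jax.grad mirror the model's tree structure, a plain gradient-descent step is a single tree_map; a minimal sketch (the learning rate and update rule are illustrative, not part of the original example):

lr = 1e-3  # illustrative learning rate

# Update every parameter leaf; model and grads share the same structure.
model = jax.tree_util.tree_map(lambda p, g: p - lr * g, model, grads)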

However, if a tree contains non-array leaves such as strings or boolean flags, it becomes incompatible with JAX transformations. For this reason, Inox provides lifted transformations that treat all non-array leaves as static.

model.name = 'stainless'

@inox.jit
def loss_fn(model, x, y):
    pred = inox.vmap(model)(x)
    return jax.numpy.mean((y - pred) ** 2)

grads = inox.grad(loss_fn)(model, X, Y)
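As a quick check that the string leaf rides along as static data, the model remains callable through the lifted transformations; a minimal sketch:

# The non-array leaf is treated as static; evaluation works as before.
pred = inox.vmap(model)(X)
print(pred.shape)  # (1024, 3)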

For more information, check out the documentation at inox.readthedocs.io.

Contributing

If you have a question, found an issue, or would like to contribute, please read our contributing guidelines.

Download files

Download the file for your platform.

Source Distribution

inox-0.4.2.tar.gz (27.5 kB)

Uploaded Source

Built Distribution


inox-0.4.2-py3-none-any.whl (32.3 kB)

Uploaded Python 3

File details

Details for the file inox-0.4.2.tar.gz.

File metadata

  • Download URL: inox-0.4.2.tar.gz
  • Upload date:
  • Size: 27.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for inox-0.4.2.tar.gz

  • SHA256: 3a55a7c5ef16af3f98636ff0732089ef5186065527ffd0d73210d34c7f1f9e7f
  • MD5: e4051b6ca74a6b76da4b1c7d4191d20a
  • BLAKE2b-256: 428fac77f4b5937cf01bba4a14919a1d0e06b8d5dafc2f42e4d0734c867fe275


File details

Details for the file inox-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: inox-0.4.2-py3-none-any.whl
  • Upload date:
  • Size: 32.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for inox-0.4.2-py3-none-any.whl

  • SHA256: 88774eccfbe4302cf319d1151ecef8f93696a10c30018feb245c34070bebabfe
  • MD5: 2dfba6689299aefc75903ff45fa57c6e
  • BLAKE2b-256: 7cb0372b1ecc673336d2e0bc277fd15cf0303e3b6750bb76d46e285dfa188e45

