A flexible trainer interface for Jax and Haiku.

Bax

Bax, short for "boilerplate jax", is a small library that provides a flexible trainer interface for Jax.

Bax is opinionated in a few ways. First, it is designed for use with the Haiku neural network library and is not compatible with alternatives such as Flax. Second, Bax assumes that data will be provided as a tf.data.Dataset. The goal of this library is not to be broadly compatible and high-level, as libraries like Elegy are.
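
Because Bax consumes tf.data.Dataset objects directly, in-memory data needs to be wrapped first. Below is a minimal sketch using TensorFlow's standard from_tensor_slices API; the array names and shapes are placeholder assumptions, not anything Bax prescribes.

import numpy as np
import tensorflow as tf

# Placeholder arrays standing in for your own data.
images = np.zeros((60000, 28, 28, 1), dtype=np.uint8)
labels = np.zeros((60000,), dtype=np.int64)

# Wrap the arrays in a tf.data.Dataset of {"image", "label"} batches, the
# same structure consumed by the loss function in the example below.
dataset = (
    tf.data.Dataset.from_tensor_slices({"image": images, "label": labels})
    .shuffle(10_000)
    .batch(32, drop_remainder=True)
)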

If you are okay with making the above assumptions, then Bax will hopefully make your life much easier by implementing the boilerplate code involved in neural network training loops.

Please note that this library has not yet been extensively tested.

Installation

You can install Bax via pip:

pip install bax

Usage

Below are some simple examples that illustrate how to use Bax.

MNIST Classification

import optax
import tensorflow_datasets as tfds
import haiku as hk
import jax.numpy as jnp
import jax

from bax.trainer import Trainer


# Use TensorFlow Datasets to get our MNIST data.
train_ds = tfds.load("mnist", split="train").batch(32, drop_remainder=True)
test_ds = tfds.load("mnist", split="test").batch(32, drop_remainder=True)

# The loss function that we want to minimize.
def loss_fn(step, is_training, batch):
    model = hk.Sequential([hk.Flatten(), hk.nets.MLP([128, 128, 10])])

    preds = model(batch["image"] / 255.0)
    labels = jax.nn.one_hot(batch["label"], 10)

    loss = jnp.mean(optax.softmax_cross_entropy(preds, labels))
    accuracy = jnp.mean(jnp.argmax(preds, axis=-1) == batch["label"])

    # The first returned value is the loss, which is what will be minimized by the
    # trainer. The second value is a dictionary that can contain other metrics you
    # might be interested in (or, it can just be empty).
    return loss, {"accuracy": accuracy}

# Create a trainer that minimizes the loss function with the Adam optimizer.
trainer = Trainer(loss=loss_fn, optimizer=optax.adam(0.001))

# Run the training loop. Metrics will be printed out each time the validation
# dataset is evaluated (in this case, every 1000 steps).
trainer.fit(train_ds, steps=10000, val_dataset=test_ds, validation_freq=1000)
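
The same (step, is_training, batch) signature and (loss, metrics) return convention extend to other tasks. Below is a small, hypothetical sketch of a regression setup; the "x" and "y" feature names are placeholders for whatever structure your tf.data.Dataset yields, not names that Bax requires.

# A regression loss with the same signature and return convention as above.
def mse_loss_fn(step, is_training, batch):
    model = hk.nets.MLP([64, 64, 1])
    preds = model(batch["x"])[:, 0]
    loss = jnp.mean((preds - batch["y"]) ** 2)
    # The metrics dictionary can simply be left empty.
    return loss, {}

trainer = Trainer(loss=mse_loss_fn, optimizer=optax.sgd(0.01))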
