loaderx
Minimal data loader for Flax
Rationale for Creating loaderx
Flax supports various data loading backends, such as PyTorch, TensorFlow, Grain, and jax_dataloader. However, each comes with drawbacks:
- Installing heavy frameworks like PyTorch or TensorFlow solely for data loading is undesirable.
- Grain offers a clean API but suffers from suboptimal performance in practice.
- jax_dataloader leverages GPU memory by default, which may lead to inefficient memory usage in certain scenarios.
Design Goals of loaderx
loaderx is designed with simplicity and efficiency in mind. It follows a pragmatic approach—favoring low memory overhead and minimal dependencies. The implementation targets common use cases, with a particular focus on single-host training pipelines.
Current Limitations
At present, loaderx only supports single-host scenarios and does not yet address multi-host training setups.
How to Integrate with Flax
loaderx is mainly inspired by the design of Grain, so avoid epoch-based patterns such as for epoch in range(num_epochs); instead, pass the number of epochs to the loader and iterate over steps.
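For instance, the epoch count is handed to the loader itself and the training loop simply enumerates steps. A minimal sketch (the import path from loaderx import loader is an assumption; the loader(npz_path=..., num_epoch=...) call mirrors the full example below):

# Discouraged: explicit outer epoch loop.
# for epoch in range(num_epochs):
#   for batch in make_loader():
#     ...

# Preferred: the loader itself yields num_epoch passes over the data.
from loaderx import loader  # import path assumed

train_loader = loader(npz_path='data/mnist.npz', num_epoch=10)
for step, batch in enumerate(train_loader):
  ...  # one optimization step per batch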
The following is complete Flax (NNX) training and validation code using loaderx.
# Imports assumed from the Flax NNX MNIST example; CNN is a user-defined nnx.Module.
from flax import nnx
import optax

def loss_fn(model: CNN, batch):
  logits = model(batch['data'])
  loss = optax.softmax_cross_entropy_with_integer_labels(
      logits=logits, labels=batch['label']).mean()
  return loss, logits

@nnx.jit
def train_step(model: CNN, optimizer: nnx.Optimizer, metrics: nnx.MultiMetric, batch):
  """Train for a single step."""
  grad_fn = nnx.value_and_grad(loss_fn, has_aux=True)
  (loss, logits), grads = grad_fn(model, batch)
  metrics.update(loss=loss, logits=logits, labels=batch['label'])  # In-place updates.
  optimizer.update(grads)  # In-place updates.

@nnx.jit
def eval_step(model: CNN, metrics: nnx.MultiMetric, batch):
  loss, logits = loss_fn(model, batch)
  metrics.update(loss=loss, logits=logits, labels=batch['label'])  # In-place updates.

@nnx.jit
def pred_step(model: CNN, batch):
  logits = model(batch['data'])
  return logits.argmax(axis=1)

# loader comes from loaderx (import path assumed, see the sketch above).
train_loader = loader(npz_path='data/mnist.npz', num_epoch=10)
for step, batch in enumerate(train_loader):
  train_step(model, optimizer, metrics, batch)

  if step > 0 and step % 500 == 0:
    train_metrics = metrics.compute()
    print("Step:{} Train Acc@1: {} loss: {}".format(
        step, train_metrics['accuracy'], train_metrics['loss']))
    metrics.reset()  # Reset the metrics for the train set.

    # Compute the metrics on the validation set every 500 steps.
    val_loader = loader(npz_path='data/mnist.npz', num_epoch=1)
    for val_batch in val_loader:
      eval_step(model, metrics, val_batch)
    val_metrics = metrics.compute()
    print("Step:{} Val Acc@1: {} loss: {}".format(
        step, val_metrics['accuracy'], val_metrics['loss']))
    metrics.reset()  # Reset the metrics for the val set.
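The pred_step function defined above is not called in the training loop; a minimal inference sketch under the same assumptions (loader API and trained model as in the example):

# Run pred_step on a single batch to obtain predicted class indices.
test_loader = loader(npz_path='data/mnist.npz', num_epoch=1)
for test_batch in test_loader:
  preds = pred_step(model, test_batch)  # shape (batch_size,), integer class ids
  print(preds[:10])
  break  # inspect only the first batch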
Download files
Download the file for your platform.
Source Distribution: loaderx-0.0.4.tar.gz (4.9 kB)
Built Distribution: loaderx-0.0.4-py3-none-any.whl (6.8 kB)
File details
Details for the file loaderx-0.0.4.tar.gz.
File metadata
- Download URL: loaderx-0.0.4.tar.gz
- Upload date:
- Size: 4.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 17fde872b99018e0c2f297edd3460a463413f11392a2bfe458e89fa4c496044f |
| MD5 | 5d17a23b8fb9edf41fd0b233998c7029 |
| BLAKE2b-256 | d0728cefb527e71e1284a60c5e576b069a709cd0dbb60f8d415302e138bc0d6c |
File details
Details for the file loaderx-0.0.4-py3-none-any.whl.
File metadata
- Download URL: loaderx-0.0.4-py3-none-any.whl
- Upload date:
- Size: 6.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 12fe8452cc8cad2dac34557f8605d2a509c8e80da7783c4595fa3530f1ee5275 |
| MD5 | 68c9f9c1274fec8e4f64e9255e35928c |
| BLAKE2b-256 | 96c5e19e5792d511c60f45d9d263afc87f69cb0249b34c753749cc313540ca7f |
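A downloaded file can be checked against the SHA256 digests listed above, for example with Python's hashlib (a minimal sketch; the local file path is assumed):

import hashlib

# Hypothetical local path to the downloaded wheel.
path = 'loaderx-0.0.4-py3-none-any.whl'
expected = '12fe8452cc8cad2dac34557f8605d2a509c8e80da7783c4595fa3530f1ee5275'

with open(path, 'rb') as f:
  digest = hashlib.sha256(f.read()).hexdigest()
print('OK' if digest == expected else 'hash mismatch')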