A Flax trainer
XTRAIN: a tiny library for training Flax models.
Design goals:
- Help avoid boilerplate code
- Minimal functionality and dependencies
- Agnostic to hardware configuration (e.g., GPU -> TPU)
General workflow
Step 1: define your model
import flax.linen as nn

class MyFlaxModule(nn.Module):
    @nn.compact
    def __call__(self, x):
        ...
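For illustration, a concrete module might look like the following (a hypothetical two-layer MLP; the class name and layer sizes are placeholders, not part of xtrain):

import flax.linen as nn

class MLP(nn.Module):
    hidden: int = 64   # hidden width (arbitrary)
    n_out: int = 10    # output dimension (arbitrary)

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.hidden)(x))
        return nn.Dense(self.n_out)(x)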
Step 2: define loss function
def my_loss_func(batch, prediction):
    x, y_true = batch
    loss = ...
    return loss
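As a concrete sketch, assuming a classification model whose prediction is a logits array and whose labels are integer class ids, the loss could be computed with optax (the cross-entropy choice is just an example):

import optax

def my_loss_func(batch, prediction):
    x, y_true = batch
    # mean cross-entropy between predicted logits and integer labels
    return optax.softmax_cross_entropy_with_integer_labels(prediction, y_true).mean()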
Step 3: create an iterator that supplies training data
my_data = zip(sequence_of_inputs, sequence_of_labels)
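A plain generator of numpy arrays works as well. Here is a minimal sketch with synthetic data (shapes, batch size, and number of batches are arbitrary):

import numpy as np

def my_data_gen():
    rng = np.random.default_rng(0)
    for _ in range(100):
        x = rng.normal(size=(32, 16)).astype(np.float32)  # inputs
        y = rng.integers(0, 10, size=(32,))                # labels
        yield x, y

my_data = my_data_gen()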
Step 4: train
import optax
import xtrain

# create and initialize a Trainer object
trainer = xtrain.Trainer(
    model=MyFlaxModule(),
    losses=my_loss_func,
    optimizer=optax.adam(1e-4),
)
train_iter = trainer.train(my_data)  # returns an iterable object

# iterating over train_iter trains the model
for epoch in range(3):
    for model_out in train_iter:
        pass
    print(train_iter.loss_logs)
    train_iter.reset_loss_logs()
Training data format
- tensorflow Dataset (see the sketch after this list)
- torch DataLoader
- generator function
- any other Python iterable that produces numpy data
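As a sketch, an in-memory tensorflow Dataset could be built like this (the arrays and batch size are placeholders):

import numpy as np
import tensorflow as tf

xs = np.random.rand(1000, 16).astype(np.float32)
ys = np.random.randint(0, 10, size=(1000,))

ds = tf.data.Dataset.from_tensor_slices((xs, ys)).batch(32)
train_iter = trainer.train(ds)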
Checkpointing
train_iter is compatible with Orbax checkpointing.
import orbax.checkpoint as ocp
ocp.StandardCheckpointer().save(cp_path, args=ocp.args.StandardSave(train_iter))
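Restoring follows the same pattern; a minimal sketch, assuming Orbax's StandardRestore args and a cp_path that already holds a checkpoint:

import orbax.checkpoint as ocp

# use the existing train_iter as the target structure for restoration
restored = ocp.StandardCheckpointer().restore(cp_path, args=ocp.args.StandardRestore(train_iter))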
Freeze submodule
train_iter.freeze("submodule/Dense_0/kernel")
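A usage sketch combining this with the training loop above (the parameter path shown is only illustrative; use whatever path your module actually produces):

# freeze one kernel, then keep iterating; frozen parameters are
# excluded from further updates
train_iter.freeze("submodule/Dense_0/kernel")
for model_out in train_iter:
    pass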
Simple batch parallelism on multiple devices
# add a new batch dim to your dataset
ds = ds.batch(8)

# create a trainer with the Distributed strategy
train_iter = xtrain.Trainer(model, losses, optimizer, strategy=xtrain.Distributed).train(ds)
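Assuming the extra batch dimension is what gets split across devices (an assumption here, not something stated by the library), a natural choice is to size it to the number of local accelerators:

import jax

# one sub-batch per local device, e.g. 8 on an 8-GPU host (assumption)
ds = ds.batch(jax.local_device_count())
train_iter = xtrain.Trainer(model, losses, optimizer, strategy=xtrain.Distributed).train(ds)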
API documentation