A simple trainer for Flax
Project description
Flax-Pilot
Flax-Pilot simplifies writing training loops for Google's Flax framework. I started this project as a newcomer to Flax to deepen my own understanding, so it is a beginner's take on building training workflows and still has room to be refined and extended. Planned additions include training with multiple optimizers, more metric modules, callbacks, and support for more complex training loops. Flax-Pilot already supports distributed training across multiple devices.
As of 27-7-2024, the trainer is available as a package for GPU & CPU.
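Assuming the distribution name on PyPI matches the file names listed further down (flax_pilot, which pip normalizes to flax-pilot), installation is a single command:
pip install flax-pilot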
How to Use?
🛠️ Write a flax.linen Module
import flax.linen as nn

class CNN(nn.Module):
    @nn.compact
    def __call__(self, x, deterministic):
        x = nn.Conv(features=32, kernel_size=(3, 3))(x)
        x = nn.avg_pool(x, window_shape=(2, 2), strides=(2, 2))
        x = x.reshape((x.shape[0], -1))
        x = nn.Dense(features=256)(x)
        x = nn.Dropout(rate=0.5, deterministic=deterministic)(x)
        x = nn.Dense(features=10)(x)
        return x
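For orientation, initializing this module in plain Flax looks like the sketch below; the trainer presumably does something equivalent internally from the input_shape dict defined in the next step (an illustration, not fpilot's actual code).
import jax
import jax.numpy as jnp

model = CNN()
rng = jax.random.PRNGKey(0)
dummy_x = jnp.zeros((1, 28, 28, 1))  # matches input_shape['x'] in the next step
params = model.init(rng, dummy_x, deterministic=True)  # dropout disabled during init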
🔧 Define Optimizer, Input Shapes, and Dict of Loss & Metric Trackers
Loss trackers (lt) take in a scalar loss value and average it over the course of training.
Metric trackers (mt) take in (y_true, y_pred), compute the metric score, and average it over the course of training.
import optax as tx
from fpilot import BasicTrackers as tr

opt = tx.adam(0.0001)
input_shape = {'x': (1, 28, 28, 1)}

# Create tracker instances.
loss_metric_tracker_dict = {
    'lt': {'loss': tr.Mean()},
    'mt': {'F1': tr.F1Score(threshold=0.6, num_classes=10, average='macro')}
}
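As a mental model, each tracker simply accumulates the values it receives and reports their running average. The sketch below is for illustration only; the actual fpilot.BasicTrackers interface may differ.
# Conceptual sketch of a running-mean tracker (illustration only;
# not the fpilot implementation).
class RunningMean:
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def update(self, value):
        # Called once per step with a scalar, e.g. the loss.
        self.total += float(value)
        self.count += 1

    def result(self):
        # Running average of everything seen so far.
        return self.total / max(self.count, 1)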
🧮 Create loss_fn
A function with the exact signature shown below; it returns a scalar loss and a dict of loss & metric values.
The key names lt and mt must not be changed, since the training loop depends on them. The subkey names (loss, F1) can be changed freely,
but they must match between loss_metric_tracker_dict and loss_metric_value_dict.
import optax as tx

# The first return value of this fn is differentiated w.r.t. the fn's first param.
def loss_fn(params, apply, sample, deterministic, det_key, step):
    x, y = sample
    yp = apply(params, x, deterministic=deterministic, rngs={'dropout': det_key})
    loss = tx.softmax_cross_entropy(logits=yp, labels=y).mean()  # y as one-hot labels
    loss_metric_value_dict = {'lt': {'loss': loss}, 'mt': {'F1': (y, yp)}}
    return loss, loss_metric_value_dict
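The note about differentiation corresponds to the standard JAX pattern sketched below (the usual approach, not necessarily fpilot's exact internals): the gradient is taken with respect to the first argument, and has_aux=True carries the loss/metric dict alongside the scalar loss.
import jax

# Differentiate loss_fn w.r.t. params (argnums=0). has_aux=True because
# loss_fn returns loss_metric_value_dict in addition to the scalar loss.
grad_fn = jax.value_and_grad(loss_fn, argnums=0, has_aux=True)
# Example call (names as used elsewhere in this README):
# (loss, loss_metric_value_dict), grads = grad_fn(
#     params, CNN().apply, sample, deterministic=False,
#     det_key=dropout_key, step=step)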
🏋️ Create Trainer Instance
from fpilot import Trainer

trainer = Trainer(CNN(), input_shape, opt, loss_fn, loss_metric_tracker_dict)
📈 Train the Model & Evaluate
train_ds = ...  # tf.data.Dataset as a numpy iterator
val_ds = ...    # tf.data.Dataset as a numpy iterator
epochs = 10                           # number of passes over train_ds
train_steps, val_steps = 10000, 1000  # steps per epoch
ckpt_path = "/saved/model/model_1"    # if None, no checkpoints are saved during training
trainer.train(epochs, train_ds, val_ds, train_steps, val_steps, ckpt_path)
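One way to build such iterators is sketched below using tensorflow_datasets (an assumption for illustration; any tf.data pipeline exposed via as_numpy_iterator that yields (x, y) batches with one-hot labels should work).
import tensorflow as tf
import tensorflow_datasets as tfds  # used only in this sketch

def make_iterator(split, batch_size=64):
    ds = tfds.load('mnist', split=split, as_supervised=True)
    def preprocess(x, y):
        x = tf.cast(x, tf.float32) / 255.0  # (28, 28, 1) floats in [0, 1]
        y = tf.one_hot(y, 10)               # loss_fn above expects one-hot labels
        return x, y
    ds = ds.map(preprocess).shuffle(10_000).batch(batch_size).repeat()
    return ds.as_numpy_iterator()

train_ds = make_iterator('train')
val_ds = make_iterator('test')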
What's next?
- Separate package for TPU.
- Callbacks.
- TensorBoard logging.
Demo
See the 'examples' folder for training tutorials. The vae-gan-cfg-using-pretrained notebook demonstrates using the trainer as an installed Python package, while the other notebooks use the trainer from a git clone of the repository. For the simplest starting point, begin with the vae-gan-cfg-using-pretrained notebook.
File details
Details for the file flax_pilot-0.1.13.tar.gz.
File metadata
- Download URL: flax_pilot-0.1.13.tar.gz
- Size: 11.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3c7148036fb1038695256990a0a62990a6a8a6412a2882477a8bbe9d1578fb49
MD5 | 9830483620fd35bf2d9f6aa407e09de2
BLAKE2b-256 | 3b5e1cad258f0165b18ecb59d6c015b04a8a213cb5733742757c28f5366a09d0
File details
Details for the file flax_pilot-0.1.13-py3-none-any.whl.
File metadata
- Download URL: flax_pilot-0.1.13-py3-none-any.whl
- Size: 12.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1f8f3995a52da8458e9904f64985967795f97e94ef1dbc016bd973ea32ab2b1f
MD5 | 7534c14c446959ec8b041bf814cb9879
BLAKE2b-256 | c74d25a014aa976ac7d834a849ae3ce29ca2319a31e69df318a555307ef08546