A simple trainer for Flax
Project description
Flax-Pilot
Flax-Pilot aims to simplify writing training loops for Google's Flax framework. As someone new to Flax, I started this project to deepen my understanding, so this module is a beginner's exploration of building efficient training workflows, and further expertise will be needed to refine and expand it. Future plans include multi-optimizer training, diverse metric modules, callbacks, and more complex training loops, aiming to enhance its functionality and versatility. Flax-Pilot supports distributed training, ensuring scalability and efficiency across multiple devices.
As of 27-7-2024, the trainer is available as a package.
How to Use?
🛠️ Write a flax.linen Module
```python
import flax.linen as nn

class CNN(nn.Module):
    @nn.compact
    def __call__(self, x, deterministic):
        x = nn.Conv(features=32, kernel_size=(3, 3))(x)
        x = nn.avg_pool(x, window_shape=(2, 2), strides=(2, 2))
        x = x.reshape((x.shape[0], -1))
        x = nn.Dense(features=256)(x)
        x = nn.Dropout(rate=0.5, deterministic=deterministic)(x)
        x = nn.Dense(features=10)(x)
        return x
```
🔧 Define Optimizer, Input Shapes, and Dict of Loss & Metric Trackers
Loss trackers (`lt`) take in a scalar loss value and average it over training.
Metric trackers (`mt`) take in `y_true, y_pred`, compute the metric score, and average it over training.
```python
import optax as tx
from fpilot import BasicTrackers as tr

opt = tx.adam(0.0001)
input_shape = {'x': (1, 28, 28, 1)}

# Create tracker instances.
loss_metric_tracker_dict = {
    'lt': {'loss': tr.Mean()},
    'mt': {'F1': tr.F1Score(threshold=0.6, num_classes=10, average='macro')}
}
```
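For intuition, `average='macro'` means per-class F1 scores are computed independently and then averaged without class weighting. A minimal NumPy sketch over integer class labels (illustrative only, not the `fpilot` implementation):

```python
import numpy as np

def macro_f1(y_true, y_pred, num_classes):
    """Unweighted mean of per-class F1 scores over integer class labels."""
    scores = []
    for c in range(num_classes):
        tp = np.sum((y_pred == c) & (y_true == c))  # correctly predicted as c
        fp = np.sum((y_pred == c) & (y_true != c))  # wrongly predicted as c
        fn = np.sum((y_pred != c) & (y_true == c))  # missed instances of c
        denom = 2 * tp + fp + fn
        scores.append(2 * tp / denom if denom else 0.0)
    return float(np.mean(scores))

y_true = np.array([0, 0, 1, 1, 2])
y_pred = np.array([0, 1, 1, 1, 2])
print(macro_f1(y_true, y_pred, num_classes=3))  # → ~0.822
```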
🧮 Create loss_fn
`loss_fn` is a function that takes the parameters shown in the code below and returns a scalar loss plus a dict of loss and metric values.
The key names `lt` and `mt` must not be changed anywhere, as the training loops depend on them. The subkey names (`loss`, `F1`) are free to change,
but must match across `loss_metric_tracker_dict` and `loss_metric_value_dict`.
```python
import optax as tx

# This fn's 1st return value is differentiated wrt the fn's first param.
def loss_fn(params, apply, sample, deterministic, det_key, step):
    x, y = sample
    yp = apply(params, x, deterministic=deterministic, rngs={'dropout': det_key})
    # optax.softmax_cross_entropy expects logits first, then labels.
    loss = tx.softmax_cross_entropy(yp, y).mean()
    loss_metric_value_dict = {'lt': {'loss': loss}, 'mt': {'F1': (y, yp)}}
    return loss, loss_metric_value_dict
```
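Note that `optax.softmax_cross_entropy` takes logits first and one-hot labels second. A minimal NumPy sketch of the same computation, useful for sanity-checking (illustrative only, not part of Flax-Pilot):

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    """Per-example cross-entropy between one-hot labels and softmax(logits)."""
    # Subtract the row max for numerical stability before exponentiating.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    return -(labels * log_softmax).sum(axis=-1)

logits = np.array([[2.0, 0.5, 0.1]])
labels = np.array([[1.0, 0.0, 0.0]])  # one-hot
print(softmax_cross_entropy(logits, labels).mean())  # → ~0.317
```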
🏋️ Create Trainer Instance
```python
from fpilot import Trainer

trainer = Trainer(CNN(), input_shape, opt, loss_fn, loss_metric_tracker_dict)
```
📈 Train the Model & Evaluate
```python
train_ds = ...  # tf.data.Dataset as numpy iterator
val_ds = ...    # tf.data.Dataset as numpy iterator
epochs = 10
train_steps, val_steps = 10000, 1000  # steps per epoch
ckpt_path = "/saved/model/model_1"    # If set to None, no checkpoints will be saved during training.
trainer.train(epochs, train_ds, val_ds, train_steps, val_steps, ckpt_path)
```
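Any Python iterator that yields `(x, y)` NumPy batches should work in place of a `tf.data` numpy iterator. A synthetic stand-in for smoke-testing the loop, with shapes matching the CNN above (names and sizes are illustrative assumptions):

```python
import numpy as np

def synthetic_batches(batch_size=32, num_classes=10, seed=0):
    """Endlessly yield (x, y) batches shaped like MNIST with one-hot labels."""
    rng = np.random.default_rng(seed)
    while True:
        x = rng.normal(size=(batch_size, 28, 28, 1)).astype(np.float32)
        labels = rng.integers(0, num_classes, size=batch_size)
        y = np.eye(num_classes, dtype=np.float32)[labels]  # one-hot encode
        yield x, y

train_ds = synthetic_batches()
x, y = next(train_ds)
print(x.shape, y.shape)  # (32, 28, 28, 1) (32, 10)
```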
Demo
Review the 'examples' folder for training tutorials. The vae-gan-cfg-using-pretrained notebook demonstrates how to use
the trainer as a Python package, while the other notebooks show how to use the trainer with git clone.
For the simplest training walkthrough, see the vae-gan-cfg-using-pretrained notebook.
Project details
File details
Details for the file flax_pilot-0.1.8.tar.gz.
File metadata
- Download URL: flax_pilot-0.1.8.tar.gz
- Upload date:
- Size: 10.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 38ac089705f36e16638d1b85e7c2737841638c48963a47bcaf63b714b6f807b8 |
| MD5 | 02232b504796f77ef396b6c8590ef56c |
| BLAKE2b-256 | d73b795c2514b21df071dc5d1e86261920a47bbca2c432f394e8c3ddf466c767 |
File details
Details for the file flax_pilot-0.1.8-py3-none-any.whl.
File metadata
- Download URL: flax_pilot-0.1.8-py3-none-any.whl
- Upload date:
- Size: 12.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5d4b07514c7181152241d23623a0e3d55a631b00ce438999acc80a09215efa4e |
| MD5 | 15ef6f929c5e01eb6c5a316d88696457 |
| BLAKE2b-256 | 155616fac0e58473e85170daebbb1e863442b980d9563a65a0a89306bd132862 |