
Utilities for training models in PyTorch

Project description

xt-training

Description

This repo contains utilities for training deep learning models in PyTorch, developed by Xtract AI.

Installation

From PyPI:

pip install xt-training

From source:

git clone https://github.com/XtractTech/xt-training.git
pip install ./xt-training
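
As a quick sanity check (not an official step), the package's main entry points should import without error after installation:

# Quick, illustrative check that xt-training installed correctly
from xt_training import Runner, metrics
print(Runner, metrics)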

Usage

See specific help on a class or function using Python's built-in help, e.g. help(Runner).

Training a model

from xt_training import Runner, metrics
from torch.utils.tensorboard import SummaryWriter

# Here, define class instances for the required objects
# model = 
# optimizer = 
# scheduler = 
# loss_fn = 

# Define metrics - each of these will be printed for each iteration
# Either per-batch or running-average values can be printed
batch_metrics = {
    'eps': metrics.EPS(),
    'acc': metrics.Accuracy(),
    'kappa': metrics.Kappa(),
    'cm': metrics.ConfusionMatrix()
}

# Define tensorboard writer
writer = SummaryWriter()

# Create runner
runner = Runner(
    model=model,
    loss_fn=loss_fn,
    optimizer=optimizer,
    scheduler=scheduler,
    batch_metrics=batch_metrics,
    device='cuda:0',
    writer=writer
)

# Define dataset and loaders
# dataset = 
# train_loader = 
# val_loader = 

# Train
model.train()
runner(train_loader)
batch_metrics['cm'].print()

# Evaluate
model.eval()
runner(val_loader)
batch_metrics['cm'].print()

# Print training and evaluation history
print(runner)
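
The example above leaves the model, optimizer, scheduler, loss function, and data loaders as placeholders. As one minimal sketch (not part of xt-training itself), these can be ordinary PyTorch objects, for example:

# Illustrative placeholder definitions only; any standard PyTorch model,
# optimizer, scheduler, loss function, and DataLoader should work here.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, random_split

# Toy binary classifier over random 32-dimensional features
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

# Toy dataset split into training and validation loaders
dataset = TensorDataset(torch.randn(1000, 32), torch.randint(0, 2, (1000,)))
train_set, val_set = random_split(dataset, [800, 200])
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
val_loader = DataLoader(val_set, batch_size=64)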

Scoring a model

import torch
from xt_training import Runner

# Here, define the model
# model = 
# model.load_state_dict(torch.load(<checkpoint file>))

# Create runner
# (alternatively, can use a fully-specified training runner as in the example above)
runner = Runner(model=model, device='cuda:0')

# Define dataset and loaders
# dataset = 
# test_loader = 

# Score
model.eval()
y_pred, y = runner(test_loader, return_preds=True)
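
With return_preds=True, the runner returns the stacked predictions and targets. Assuming a classification model that outputs logits and integer labels (as in the training sketch above), these tensors can be post-processed directly, for example:

# Illustrative post-processing; assumes y_pred holds class logits and y holds
# integer labels, both as tensors returned by the runner.
predicted_classes = y_pred.argmax(dim=1)
accuracy = (predicted_classes == y).float().mean().item()
print(f"Test accuracy: {accuracy:.3f}")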

Data Sources

[descriptions and links to data]

Dependencies/Licensing

[list of dependencies and their licenses, including data]

References

[list of references]

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

xt-training-1.1.3.tar.gz (7.3 kB)


Built Distribution

xt_training-1.1.3-py3-none-any.whl (7.9 kB)


File details

Details for the file xt-training-1.1.3.tar.gz.

File metadata

  • Download URL: xt-training-1.1.3.tar.gz
  • Size: 7.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3

File hashes

Hashes for xt-training-1.1.3.tar.gz

  • SHA256: 18ae3f029640825a0cf5bb80036282d95539b936b393d7825f6be024b0caacfb
  • MD5: 0559389a7f20476559699b98c3ba0119
  • BLAKE2b-256: 1ef7c7dbd3a17beeeaef2c1e211b9869888b34fde3fb818ce9a1a37bda92672c


File details

Details for the file xt_training-1.1.3-py3-none-any.whl.

File metadata

  • Download URL: xt_training-1.1.3-py3-none-any.whl
  • Size: 7.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3

File hashes

Hashes for xt_training-1.1.3-py3-none-any.whl

  • SHA256: 88bbfadaf092ff3755f17ebf043bea79e273601173387878c266dc976968a97e
  • MD5: 09046fe80b4c52ef5b89987efb772b77
  • BLAKE2b-256: 6cc1fc9d15fb205f24746b65bf9eb6c44dcfdc9b48b7a9ae7a2b414024a4f7e9

