A small framework for training and evaluating models in notebooks with pytorch

Reason this release was yanked:

test

Project description

A lean framework for developing, training, and testing models. Inspired by the d2l paradigm of DataModule, Trainer, and Module objects, this project generalizes that design and adds features. Develop, train, and retrain your model in just a couple of blocks of readable code!

Features

  • A magic Config class which allows flexible and hassle-free specification of training hyperparameters and configuration. Only type what you need! Especially useful for hyperparameter search.

  • Datamodule wrapping Dataset

    • Various methods for viewing your data. Know your data intimately.
    • Creates train and validation Dataloaders with sensible options.
    • Visualizes image data, provides encoding and decoding for ClassifierData.
    • Easily configure data splitting and iterate over splits using sklearn cross-validators.
    • Automatic dataset creation from DataFrames and Tensors.
  • Convenient features added to torch modules, enabling automatic naming, saving, loading, and layer statistics. A pred method lets all evaluators of the classification variant automatically use the prediction rather than the forward output.

  • A fully featured training loop with preprovisioned methods to activate the following features:

    • Automatic model saving and loading. The load_previous kwarg allows training to resume from saved model/optimizer/scheduler parameters.
    • Automatic GPU discovery, as well as moving data to the right device for various functions.
    • Realtime plotting of loss curves as well as other declarable metrics. Supports both batch and epoch units, depending on whether the dataloader is iterable or miniepochs are used.
    • Callbacks for training loop customization
    • Easily log training-time metrics or save them to parquet (todo).
    • A convenient method for evaluating loaded models for one epoch.
    • DistributedDataParallel functionality (IP)
  • A MetricFrame class which aggregates metrics compatible with torcheval.

    • Integrates with the infer and fit methods of the Trainer class to allow staggered recording and realtime plotting.
    • Smart units ensure minimal math is needed to determine appropriate parameters
  • Convenient utils for data processing, statistic plotting, ndarray manipulation and more
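The "only type what you need" Config idea above can be sketched in plain Python. This is an illustrative reimplementation of the pattern, not torch-nibs's actual API; the class and method names here are assumptions:

```python
# Illustrative sketch of a "magic Config" in plain Python.
# The names (Config, .get) are assumptions, not torch-nibs's real API.

class Config:
    """Hold only the hyperparameters you actually set; everything
    else falls back to a default at lookup time."""

    def __init__(self, **hparams):
        self.__dict__.update(hparams)

    def get(self, name, default=None):
        # Unset hyperparameters resolve to a default instead of raising,
        # which keeps hyperparameter-search loops free of boilerplate.
        return self.__dict__.get(name, default)


# Only type what you need:
cfg = Config(lr=1e-3, max_epochs=10)
print(cfg.lr)                     # 0.001
print(cfg.get("batch_size", 32))  # 32 (fallback default)

# Hyperparameter search: override a single field per trial.
for lr in (1e-2, 1e-3):
    trial = Config(**{**cfg.__dict__, "lr": lr})
    print(trial.lr, trial.max_epochs)
```

Storing kwargs directly on the instance keeps attribute access (`cfg.lr`) natural, while `get` gives a raise-free path for optional settings during sweeps.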

Installation

You can install torch-nibs via pip:

pip install torch-nibs

Note that this project was created with pixi. If you are installing with pip, you will also need the following dependencies:

"torch>=1.10.0",
"torchvision>=0.11.0",
"torchaudio>=0.10.0",
"polars",
"wandb",
"jupyter",
"pip",
"ipympl",
"plotly",
"tqdm",
"seaborn>=0.13.2,<0.14",
"scikit-learn>=1.5.2,<2",
"openpyxl>=3.1.5,<4",
"fastexcel>=0.12.0,<0.13",
"pandas>=2.2.3,<3",
"datasets>=3.2.0,<4; extra == 'huggingface'",
"jupyter_console>=6.6.3,<7",

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_nibs-0.0.3.tar.gz (30.9 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

torch_nibs-0.0.3-py3-none-any.whl (39.9 kB view details)

Uploaded Python 3

File details

Details for the file torch_nibs-0.0.3.tar.gz.

File metadata

  • Download URL: torch_nibs-0.0.3.tar.gz
  • Upload date:
  • Size: 30.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.28.1

File hashes

Hashes for torch_nibs-0.0.3.tar.gz
Algorithm Hash digest
SHA256 fc01c43843ef34d3ac0d544fda1497491a66f5d996007c0b381bbf7a9f2c512e
MD5 f9c70df70571938a2e57e9d71ddb29c9
BLAKE2b-256 1822fb7e6ad7ea8da13a6ee71cbcc32a05a7b9ad7a293c70aa8aaa932aa076db

See more details on using hashes here.
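If you want to pin this release against the digests published above, pip's hash-checking mode can be used with a requirements file. Both SHA256 values below are copied from the file details on this page:

```
# requirements.txt -- pin torch-nibs 0.0.3 to the published digests
torch-nibs==0.0.3 \
    --hash=sha256:fc01c43843ef34d3ac0d544fda1497491a66f5d996007c0b381bbf7a9f2c512e \
    --hash=sha256:ed384612e60418b2239bf63844824c2a9a019199709d978c92bb09ece3a3faee
```

Installing with pip install --require-hashes -r requirements.txt will then fail if a downloaded file does not match one of the listed digests.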

File details

Details for the file torch_nibs-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: torch_nibs-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 39.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.28.1

File hashes

Hashes for torch_nibs-0.0.3-py3-none-any.whl
Algorithm Hash digest
SHA256 ed384612e60418b2239bf63844824c2a9a019199709d978c92bb09ece3a3faee
MD5 deb840195f0868c8377a63e0bac714c6
BLAKE2b-256 78e40a474056315de90203e4dec52463b10b85c973e202a4f9a3454e70fda361

See more details on using hashes here.
