A small framework for training and evaluating models in notebooks with PyTorch

Reason this release was yanked: test

Project description

A lean framework for developing, training, and testing models. Inspired by the d2l framework's DataModule, Trainer, and Module objects, this project builds on that paradigm by generalizing it and adding features. Develop, train, and retrain your model in just a couple of blocks of readable code!

Features

  • A magic Config class which allows flexible and hassle-free handling of training hyperparameters and configuration. Only type what you need! Especially useful for hyperparameter search.

  • Datamodule wrapping Dataset

    • Various methods for viewing your data. Know your data intimately.
    • Creates train and validation Dataloaders with sensible options.
    • Visualizes image data, provides encoding and decoding for ClassifierData.
    • Easily configure data splitting and iterate over splits using sklearn cross-validators.
    • Automatic dataset creation from DataFrames and Tensors.
  • Convenient features added to torch modules, enabling automatic naming, saving, loading, and layer statistics. A pred method lets all evaluators of the classification variant automatically use the prediction rather than the forward output.

  • A fully featured training loop with preprovisioned methods to activate the following features:

    • Automatic model saving and loading. The load_previous kwarg allows training to resume from saved model/optimizer/scheduler parameters.
    • Auto-GPU discovery, as well as moving data to the right device for various functions.
    • Realtime plotting of loss curves as well as other declarable metrics. Supports both batch and epoch units, depending on whether the dataloader is iterable or miniepochs are used.
    • Callbacks for training loop customization
    • Easily log training-time metrics or save them to parquet (todo).
    • A convenient method for evaluating loaded models for one epoch.
    • DistributedDataParallel functionality (IP)
  • A MetricFrame class which aggregates metrics compatible with torcheval.

    • Integrates with the infer and fit methods of the Trainer class to allow staggered recording and realtime plotting.
    • Smart units ensure minimal math is needed to determine appropriate parameters.
  • Convenient utils for data processing, statistics plotting, ndarray manipulation, and more.
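To illustrate the "only type what you need" idea behind the Config class, here is a minimal sketch of the pattern, independent of torch-nibs' actual implementation. The field names (lr, batch_size, max_epochs) are illustrative assumptions, not the library's real defaults:

```python
class Config:
    """Sketch of a defaults-plus-overrides config; field names are hypothetical."""
    lr = 1e-3
    batch_size = 32
    max_epochs = 10

    def __init__(self, **overrides):
        for key, value in overrides.items():
            # Reject typos instead of silently creating unknown hyperparameters
            if not hasattr(type(self), key):
                raise AttributeError(f"unknown hyperparameter: {key}")
            setattr(self, key, value)

cfg = Config(lr=3e-4)          # only type what you need
print(cfg.lr, cfg.batch_size)  # → 0.0003 32
```

A class like this also suits hyperparameter search, since each trial can be expressed as just the overrides that differ from the defaults.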

Installation

You can install torch-nibs via pip:

pip install torch-nibs

Note that this project was created with pixi. If you are installing with pip, you will also need the following dependencies:

"torch>=1.10.0",
"torchvision>=0.11.0",
"torchaudio>=0.10.0",
"polars",
"wandb",
"jupyter",
"pip",
"ipympl",
"plotly",
"tqdm",
"seaborn>=0.13.2,<0.14",
"scikit-learn>=1.5.2,<2",
"openpyxl>=3.1.5,<4",
"fastexcel>=0.12.0,<0.13",
"pandas>=2.2.3,<3",
"datasets>=3.2.0,<4; extra == 'huggingface'",
"jupyter_console>=6.6.3,<7",

Project details


Download files

Download the file for your platform.

Source Distribution

torch_nibs-0.0.2.tar.gz (30.8 kB view details)

Uploaded Source

Built Distribution

torch_nibs-0.0.2-py3-none-any.whl (39.7 kB view details)

Uploaded Python 3

File details

Details for the file torch_nibs-0.0.2.tar.gz.

File metadata

  • Download URL: torch_nibs-0.0.2.tar.gz
  • Upload date:
  • Size: 30.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.28.1

File hashes

Hashes for torch_nibs-0.0.2.tar.gz
Algorithm Hash digest
SHA256 eb72fc157da301db56e36737014ef241f62618ca3379368e9355319f493613de
MD5 faad6681fad41a433b3732de052d8058
BLAKE2b-256 9104a998b34435141fa3f8fb79e11a1084f10c3126a95eef422d6cddf75b3f2f
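A downloaded archive can be checked against the SHA256 digest above before installing. A minimal sketch (the path is illustrative and assumed to point at wherever you saved the archive):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Stream a file and return its hexadecimal SHA-256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Published digest for torch_nibs-0.0.2.tar.gz (from the table above)
EXPECTED = "eb72fc157da301db56e36737014ef241f62618ca3379368e9355319f493613de"

# Compare against the published digest (path is illustrative):
# assert sha256_of("torch_nibs-0.0.2.tar.gz") == EXPECTED
```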

File details

Details for the file torch_nibs-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: torch_nibs-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 39.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.28.1

File hashes

Hashes for torch_nibs-0.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 3d6827663cb9bb6340cf1f509e1fa68cee2e9df185746e6d320367c8e49805b2
MD5 fb99cb7f269f62265b3a3a4bbb08d807
BLAKE2b-256 14c0264f8c0585e99360bc607f372d6ccdfbbb5365f532857830d2c2f58a2061
