
A collection of utilities for machine learning applications.

Project description

iclearn

This project has tools for building production-focused machine learning workflows. It is used in some ICHEC projects and demos.

What is this project for?

Machine learning research often uses tools such as Jupyter Notebooks for exploring and iterating on models. A production machine learning system, on the other hand, looks quite different, focusing on reliability, error handling and standardization.

While Jupyter Notebooks work well for individual researchers, they create issues in a team and collaborative setting, namely:

  • they are difficult to use with version control
  • they discourage testing and re-use of components
  • they don't emphasise reliability and monitoring

This library aims to encourage collaborative work on machine learning problems, producing outputs that are easier to deploy to production than code extracted from a Notebook. It does this by:

  • introducing opinionated APIs to standardize all elements of the workflow, down to which command line arguments to use in scripts.
  • including common, tested components that we can develop together and rely on
  • making best-practice tooling and choices prominent and easily available
  • supporting high performance and distributed processing by default

There are many libraries supporting machine learning workflows, and by definition they need to be opinionated. This library gives us the option (which we may or may not always take) to use our own opinions and to quickly add any features that we need. It is also a chance to become intimately familiar with how machine learning workflows work.

Features

The project is both a library of utilities that can be used on their own in building a machine learning workflow, and a template for quickly constructing workflows.

It has the following modules:

  • cli: A standardized set of CLI arguments for loading with argparse; allows you to run my_prog train, my_prog infer, etc.
  • data: Data preparation, processing and loading utilities
  • environment: Sampling of the runtime environment to get information on available GPUs, CUDA features, etc.
  • model: A generic interface for machine learning models; holds things like the optimizer and metrics definitions. Similar to Keras.
  • output: Output handlers for logging, plotting and syncing with MLflow during training
  • utils: Utilities such as profilers
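The subcommand pattern that the cli module standardizes can be illustrated with plain argparse from the standard library. This is a generic sketch of the approach, not iclearn's actual implementation; the handler names and arguments are illustrative:

```python
import argparse


def train(args):
    # Hypothetical handler for the 'train' subcommand
    return f"training for {args.num_epochs} epochs"


def infer(args):
    # Hypothetical handler for the 'infer' subcommand
    return "running inference"


def get_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="my_prog")
    subparsers = parser.add_subparsers(dest="command", required=True)

    train_parser = subparsers.add_parser("train")
    train_parser.add_argument("--num_epochs", type=int, default=10)
    train_parser.set_defaults(func=train)  # dispatch target for 'train'

    infer_parser = subparsers.add_parser("infer")
    infer_parser.set_defaults(func=infer)  # dispatch target for 'infer'
    return parser


# Parsing 'my_prog train --num_epochs 20' selects the train handler
args = get_parser().parse_args(["train", "--num_epochs", "20"])
print(args.func(args))
```

The `set_defaults(func=...)` trick is what makes a final `args.func(args)` dispatch line work: each subparser attaches its own handler to the parsed namespace.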

By specializing a model with a concrete implementation via class inheritance, including an optimizer and a set of metrics, it is possible to quickly assemble a script that can train the model in a distributed compute environment, run inference or preview a dataset. As a quick pseudo-code example, it will look something like:

from pathlib import Path

import iclearn.cli
from iclearn.data import Dataloader, Splits
from iclearn.model import Model, Metrics

class MyModel(Model):

    def __init__(self, metrics: Metrics):
        # Pass the metrics and a concrete optimizer to the base Model
        super().__init__(metrics, MyOptimizer(MyLossFunc()))

    def predict(self, x):
        return ...

class MyDataloader(Dataloader):

    def load_dataset(self, root: Path, name: str, splits: Splits):
        return ...

    def load_dataloader(self, name: str):
        return ...


def create_dataset(dataset_dir: Path, batch_size: int):
    return MyDataloader(dataset_dir, batch_size)

def create_model(num_classes: int, learning_rate: float, num_batches: int):
    metrics = Metrics(...)  # build the metrics from the problem definition
    return MyModel(metrics)


if __name__ == "__main__":

    parser = iclearn.cli.get_default_parser(create_dataset, create_model)

    args = parser.parse_args()
    args.func(args)

This will create a program which can be used to train a model with a rich set of command line arguments to control the process, for example:

myprog train --dataset_dir $DATASET_DIR --node_id 0 --gpus_per_node 2 --num_epochs 20

will run a multi-GPU training session using the data in $DATASET_DIR, outputting logs, environment information and model results.
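The kind of environment snapshot mentioned above can be approximated with the standard library alone. This is a minimal sketch of the idea behind the environment module; the field names are illustrative, not iclearn's actual output format:

```python
import os
import platform


def sample_environment() -> dict:
    """Collect a small snapshot of the runtime environment."""
    return {
        "python_version": platform.python_version(),
        "platform": platform.platform(),
        "machine": platform.machine(),
        "cpu_count": os.cpu_count(),
        # GPU/CUDA details would need an extra dependency, e.g.
        # torch.cuda.is_available() when PyTorch is installed.
    }


snapshot = sample_environment()
print(snapshot["python_version"], snapshot["cpu_count"])
```

Recording such a snapshot alongside training logs makes runs easier to reproduce and compare across machines.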

Installing

The package is available on PyPI; you can install the base package with:

pip install iclearn

Most functionality so far uses PyTorch; you can install the PyTorch add-ons with:

pip install 'iclearn[torch]'
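Since the torch extra is optional, code that should work either way can probe for PyTorch at import time. This is a generic Python pattern, not something specific to iclearn:

```python
import importlib.util


def has_torch() -> bool:
    """Return True if PyTorch is importable in this environment."""
    return importlib.util.find_spec("torch") is not None


print("PyTorch available:", has_torch())
```

`find_spec` checks importability without actually importing the (large) torch package.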

Running Tests

Install the base test dependencies with:

pip install '.[test]'

then you can run:

pytest test/

License

This software is Copyright ICHEC 2024 and can be re-used under the terms of the GPL v3+. See the included LICENSE file for details.
