Modularyze is a modular, composable, and dynamic configuration engine that combines the power of dynamic webpage rendering with that of YAML. It builds on Jinja and ruamel.yaml and inherits their flexibility.
Quick Start
Installation
To install the latest version of modularyze, run this command in your terminal:

```shell
pip install modularyze
```
Example
The Modularyze package exposes one central config-builder class called ConfBuilder. Using this class you can register arbitrary constructors and callables, render templated, multi-file, dynamic configs, instantiate them, and compare configs by hash or by their normalized form.
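The "compare by hash of the normalized form" idea can be illustrated with a small standard-library sketch. This is not modularyze's API, just the underlying concept: serialize the config canonically (sorted keys, fixed separators) so that two equivalent configs produce the same digest.

```python
import hashlib
import json

def normalized_hash(conf):
    # Canonical serialization: sorted keys and fixed separators ensure
    # that key order in the source dict does not affect the digest
    canonical = json.dumps(conf, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Two configs with the same content but different key order hash identically
a = {"lr": 0.1, "model": "resnet18"}
b = {"model": "resnet18", "lr": 0.1}
print(normalized_hash(a) == normalized_hash(b))  # True
```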
To use modularyze in a project simply import it, register any callables your config might be using and point it to your configuration file. From there you can simply call build to build the config.
A simple example where we instantiate a machine learning pipeline could look something like this:
```yaml
# File: imagenet.yaml
{% set use_pretrained = use_pretrained | default(True) %}
{% set imagenet_root = imagenet_root | default('datasets/imagenet') %}

network: &network !torchvision.models.resnet18
  pretrained: {{ use_pretrained }}

val_transforms: &val_transforms !torchvision.transforms.Compose
  - !torchvision.transforms.Resize [256]
  - !torchvision.transforms.CenterCrop [224]
  - !torchvision.transforms.ToTensor

dataset: &dataset !torchvision.datasets.ImageNet
  args:
    - {{ imagenet_root }}
  kwargs:
    split: 'val'
    transforms: *val_transforms
```
```python
import torchvision

from modularyze import ConfBuilder

builder = ConfBuilder()
# Register torchvision's constructors so the !torchvision.* tags resolve
builder.register_multi_constructors_from_modules(torchvision)
conf = builder.build('imagenet.yaml')
```
Now the conf object is a Python dictionary containing a fully initialized model, dataset, and validation transforms. What if you want to change a parameter on the fly, say the ImageNet folder? Easy: simply pass in a context:
```python
conf = builder.build('imagenet.yaml', context={"imagenet_root": "new/path/to/dataset"})
```
In this way you can easily parameterize your configuration files. The context is usually a dictionary, but it can even be the path to a (non-parameterized, vanilla) YAML file.
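To see how a context overrides template defaults, here is a tiny standard-library sketch of placeholder substitution. Modularyze delegates this to Jinja, which is far more capable; the `render` helper below is purely hypothetical and only mimics the `default` fallback shown in the config above.

```python
import re

def render(template, context=None, defaults=None):
    # Merge defaults with the caller's context; context wins on conflicts,
    # mirroring how a passed-in context overrides | default(...) values
    values = {**(defaults or {}), **(context or {})}
    # Substitute each {{ name }} placeholder with its resolved value
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(values[m.group(1)]),
                  template)

template = "dataset_root: {{ imagenet_root }}"
# With no context, the default applies
print(render(template, defaults={"imagenet_root": "datasets/imagenet"}))
# A context entry overrides the default, like build(..., context=...)
print(render(template,
             context={"imagenet_root": "new/path/to/dataset"},
             defaults={"imagenet_root": "datasets/imagenet"}))
```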
What if the configuration for a model trainer lives in a different file? Say trainer.yaml instantiates a neural-network trainer instance; we can include it by adding the following line to the config file above:

```yaml
{% include 'trainer.yaml' %}
```
There are many more neat things you can do when you combine the powers of YAML and Jinja; please refer to the documentation for more.