
Project description

Enchanter

Enchanter is a library for machine learning tasks for comet.ml users.

Getting Started | Docs | Tutorial | Licence


Installation

To get started, install PyTorch for your environment. Then install Enchanter in the following way:

To install the stable release:

pip install enchanter

or

To install the latest (unstable) release:

pip install git+https://github.com/khirotaka/enchanter.git

If you want to install from a specific branch, use the following:

# e.g.) Install enchanter from develop branch.
pip install git+https://github.com/khirotaka/enchanter.git@develop

Supported Platforms

Enchanter supports:

  • macOS 10.15
  • Ubuntu 18.04 or later

Getting Started

Try your first Enchanter program. To train a neural network written in PyTorch with Enchanter, use a Runner.
There are two ways to define a Runner:

  1. Use a Runner already implemented under enchanter.tasks.
  2. Define a custom Runner that inherits from enchanter.engine.BaseRunner (sketched below).
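
For option 2, a custom Runner overrides the step methods and returns a dictionary containing the loss, the same protocol used by the CustomRunner in the mixed-precision section below. A minimal sketch; the constructor arguments and any other required overrides are omitted here and are assumptions:

import torch.nn.functional as F
from enchanter.engine import BaseRunner


class MyRunner(BaseRunner):
    # Minimal sketch: only train_step is shown. val_step / test_step follow
    # the same pattern and also return a dict containing "loss".
    def train_step(self, batch):
        x, y = batch
        out = self.model(x)              # self.model is the wrapped nn.Module
        loss = F.cross_entropy(out, y)
        return {"loss": loss}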

Let's see how to use the enchanter.tasks.ClassificationRunner, which is the easiest way.

Training Neural Network

import comet_ml
import torch
import enchanter

model = torch.nn.Linear(6, 10)
optimizer = torch.optim.Adam(model.parameters())

runner = enchanter.tasks.ClassificationRunner(
    model, 
    optimizer,
    criterion=torch.nn.CrossEntropyLoss(),
    experiment=comet_ml.Experiment()
)

runner.add_loader("train", train_loader)
runner.train_config(epochs=10)
runner.run()

Register a torch.utils.data.DataLoader with the Runner using .add_loader().
Set the number of epochs with .train_config(), and execute the Runner with .run().
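
train_loader above can be any torch.utils.data.DataLoader yielding (input, target) batches. A minimal sketch with illustrative random data shaped for the Linear(6, 10) model above:

import torch
from torch.utils.data import TensorDataset, DataLoader

# Illustrative data: 100 samples, 6 features, integer class labels in [0, 10).
inputs = torch.randn(100, 6)
targets = torch.randint(0, 10, (100,))

train_loader = DataLoader(TensorDataset(inputs, targets), batch_size=32, shuffle=True)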

Training Unsupervised Time Series Feature Learning

The unsupervised time series representation learning algorithm presented at NeurIPS 2019 is now easily available.

Please prepare the following:

  1. A PyTorch model that outputs fixed-length feature vectors regardless of the input sequence length.
  2. Time series data of shape [N, F, L] (samples, features, sequence length).
  3. (Optional) A label for each sample in item 2.

import comet_ml
import torch.nn as nn
import torch.optim as optim
import enchanter.tasks as tasks
import enchanter.addons.layers as L


class Encoder(nn.Module):
    def __init__(self, in_features, mid_features, out_features):
        super(Encoder, self).__init__()
        self.conv = nn.Sequential(
            L.CausalConv1d(in_features, mid_features, 3),
            nn.LeakyReLU(),
            L.CausalConv1d(mid_features, mid_features, 3),
            nn.LeakyReLU(),
            L.CausalConv1d(mid_features, mid_features, 3),
            nn.LeakyReLU(),
            nn.AdaptiveMaxPool1d(1)
        )
        self.fc = nn.Linear(mid_features, out_features)

    def forward(self, x):
        batch = x.shape[0]
        out = self.conv(x).reshape(batch, -1)
        return self.fc(out)


experiment = comet_ml.Experiment()
model = Encoder(...)
optimizer = optim.Adam(model.parameters())

runner = tasks.TimeSeriesUnsupervisedRunner(model, optimizer, experiment)
runner.add_loader("train", ...)
runner.run()

Labels are required for validation. For early stopping, use enchanter.callbacks.EarlyStoppingForTSUS.
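
A hedged sketch of wiring these up: add_loader comes from the examples above, but registering the callback through a callbacks keyword (and the callback's constructor arguments) are assumptions rather than documented signatures, and train_loader / val_loader are placeholders:

from enchanter.callbacks import EarlyStoppingForTSUS

# NOTE: the callbacks= keyword is an assumption; check the Enchanter docs for
# the actual way to attach callbacks to a runner.
runner = tasks.TimeSeriesUnsupervisedRunner(
    model, optimizer, experiment, callbacks=[EarlyStoppingForTSUS()]
)

runner.add_loader("train", train_loader)   # unlabeled series of shape [N, F, L]
runner.add_loader("val", val_loader)       # (series, label) pairs; labels are required here
runner.run()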

Hyperparameter search using Comet.ml

Comet's Optimizer can be combined with enchanter.utils.comet.TunerConfigGenerator to run a hyperparameter search:

from comet_ml import Optimizer

import torch
import torch.nn as nn
import torch.optim as optim
from sklearn.datasets import load_iris

import enchanter.tasks as tasks
import enchanter.addons as addons
import enchanter.addons.layers as layers
from enchanter.utils import comet


config = comet.TunerConfigGenerator(
    algorithm="bayes",
    metric="train_avg_loss",
    objective="minimize",
    seed=0,
    trials=1,
    max_combo=10
)

config.suggest_categorical("activation", ["addons.mish", "torch.relu", "torch.sigmoid"])
opt = Optimizer(config.generate())

x, y = load_iris(return_X_y=True)
x = x.astype("float32")
y = y.astype("int64")


for experiment in opt.get_experiments():
    model = layers.MLP([4, 512, 128, 3], eval(experiment.get_parameter("activation")))
    optimizer = optim.Adam(model.parameters())
    runner = tasks.ClassificationRunner(
        model, optimizer=optimizer, criterion=nn.CrossEntropyLoss(), experiment=experiment
    )

    runner.fit(x, y, epochs=1, batch_size=32)
    runner.quite()

    # or 
    # with runner:
    #   runner.fit(...)
    # or
    #   runner.run()

Training with Mixed Precision

Runners defined in enchanter.tasks now support Automatic Mixed Precision (AMP).
To enable it, write the following:

from torch.cuda import amp
from enchanter.tasks import ClassificationRunner


runner = ClassificationRunner(...)
runner.scaler = amp.GradScaler()

If you want to define a custom runner that supports mixed precision, do the following:

from torch.cuda import amp
import torch.nn.functional as F
from enchanter.engine import BaseRunner


class CustomRunner(BaseRunner):
    # ...
    def train_step(self, batch):
        x, y = batch
        with amp.autocast():        # REQUIRED
            out = self.model(x)
            loss = F.nll_loss(out, y)
        
        return {"loss": loss}


runner = CustomRunner(...)
runner.scaler = amp.GradScaler()

That is, you can enable AMP by using torch.cuda.amp.autocast() in .train_step(), .val_step() and .test_step().
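
For example, the same pattern applied to .val_step(), following the CustomRunner sketch above (the loss function and the {"loss": ...} return value are carried over from that example):

from torch.cuda import amp
import torch.nn.functional as F
from enchanter.engine import BaseRunner


class CustomRunner(BaseRunner):
    # ... train_step as above ...

    def val_step(self, batch):
        x, y = batch
        with amp.autocast():        # same autocast context as in train_step
            out = self.model(x)
            loss = F.nll_loss(out, y)
        return {"loss": loss}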

with-statement training

from comet_ml import Experiment

import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader
from sklearn.datasets import load_iris
from tqdm.auto import tqdm

import enchanter.tasks as tasks
import enchanter.engine.modules as modules
import enchanter.addons as addons
import enchanter.addons.layers as layers


experiment = Experiment()
model = layers.MLP([4, 512, 128, 3], addons.mish)
optimizer = optim.Adam(model.parameters())

x, y = load_iris(return_X_y=True)
x = x.astype("float32")
y = y.astype("int64")

train_ds = modules.get_dataset(x, y)
val_ds = modules.get_dataset(x, y)
test_ds = modules.get_dataset(x, y)

train_loader = DataLoader(train_ds, batch_size=32)
val_loader = DataLoader(val_ds, batch_size=32)
test_loader = DataLoader(test_ds, batch_size=32)

runner = tasks.ClassificationRunner(
    model, optimizer, nn.CrossEntropyLoss(), experiment
)

with runner:
    for epoch in tqdm(range(10)):
        with runner.experiment.train():
            for train_batch in train_loader:
                runner.optimizer.zero_grad()
                train_out = runner.train_step(train_batch)
                runner.backward(train_out["loss"])
                runner.update_optimizer()
    
                with runner.experiment.validate(), torch.no_grad():
                    for val_batch in val_loader:
                        val_out = runner.val_step(val_batch)["loss"]
                        runner.experiment.log_metric("val_loss", val_out)

        with runner.experiment.test(), torch.no_grad():
            for test_batch in test_loader:
                test_out = runner.test_step(test_batch)["loss"]
                runner.experiment.log_metric("test_loss", test_out)

# The latest checkpoints (model_state & optim_state) are stored
# in comet.ml after the with statement.

Graph visualization

import torch
from enchanter.utils import visualize
from enchanter.addons.layers import AutoEncoder

x = torch.randn(1, 32)  # [N, in_features]
model = AutoEncoder([32, 16, 8, 2])
visualize.with_netron(model, (x, ))

[Screenshot: Netron graph of the model]

License

Apache License 2.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

enchanter-0.9.0.tar.gz (45.3 kB)

Uploaded Source

Built Distribution

enchanter-0.9.0-py3-none-any.whl (54.8 kB)

Uploaded Python 3

File details

Details for the file enchanter-0.9.0.tar.gz.

File metadata

  • Download URL: enchanter-0.9.0.tar.gz
  • Upload date:
  • Size: 45.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.7.11 Linux/5.8.0-1039-azure

File hashes

Hashes for enchanter-0.9.0.tar.gz:

  • SHA256: 5a23b80e75d5f0fa20d090abdbdcce18f650e9a3997b76008ad2edf29754490b
  • MD5: a1efeb4ba1928627870e9e8923fcc5ea
  • BLAKE2b-256: d396e7b2b54be5d168c7a2e0e06a6855d246ae6ad94064c48ce772c367338be5


File details

Details for the file enchanter-0.9.0-py3-none-any.whl.

File metadata

  • Download URL: enchanter-0.9.0-py3-none-any.whl
  • Upload date:
  • Size: 54.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.7.11 Linux/5.8.0-1039-azure

File hashes

Hashes for enchanter-0.9.0-py3-none-any.whl:

  • SHA256: 40f46afe806653d97482ce2e8d28bee6cc30ddeb8995e501c6a904a06ff59d08
  • MD5: 4b6974cb96a948f39c9c0d5344632f10
  • BLAKE2b-256: b380fe7a179da91b395feba8391b0c1638c79730dbfafcb930c310cf6eadcb1f

