Model Ablation Tool-Kit

Project description

🚀 ABLATOR is a DISTRIBUTED EXECUTION FRAMEWORK designed to enhance ablation studies in complex machine learning models. It automates the process of configuration and conducts multiple experiments in parallel.

What are Ablation Studies?

An ablation study involves removing specific parts of a neural network architecture, or changing different aspects of the training process, to examine their contribution to the model's performance.
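For instance, a single ablation might compare a model with and without one component. The toy sketch below uses plain PyTorch (not ABLATOR); the model and the dropout component are hypothetical and only illustrate the idea.

from torch import nn

def build_model(use_dropout: bool) -> nn.Sequential:
    # Two variants of the same hypothetical model; the ablation removes dropout.
    layers = [nn.Linear(16, 32), nn.ReLU()]
    if use_dropout:
        layers.append(nn.Dropout(p=0.5))  # the component being ablated
    layers.append(nn.Linear(32, 1))
    return nn.Sequential(*layers)

# Train both variants under identical settings and compare their validation
# performance to estimate the contribution of the dropout layer.
baseline = build_model(use_dropout=True)
ablated = build_model(use_dropout=False)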

Why ABLATOR?

As machine learning models grow in complexity, the number of components that need to be ablated also increases. This expands the search space of possible configurations and calls for an efficient way to run many experimental trials in parallel. ABLATOR is a tool that provides this horizontal scaling of experimental trials.

Instead of manually configuring and conducting multiple experiments with various hyperparameter settings, ABLATOR automates this process. It initializes experiments based on different hyperparameter configurations, tracks the state of each experiment, and provides experiment persistence on the cloud.
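For contrast, the manual alternative amounts to a hand-rolled grid search: every combination of settings is enumerated by hand and run one after another. The sketch below is purely illustrative; train_and_evaluate is a hypothetical stand-in for a training loop, and tracking and persistence would also have to be handled manually.

import itertools

# A hypothetical manual sweep of the kind ABLATOR automates: every combination
# is configured by hand and executed sequentially.
layers = ["layer_a", "layer_b"]
learning_rates = [0.01, 0.1]

for layer, lr in itertools.product(layers, learning_rates):
    trial_config = {"layer": layer, "lr": lr}
    # result = train_and_evaluate(trial_config)  # hypothetical training routine
    print("would run trial with", trial_config)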

Key Features

  • It is a tool that simplifies the process of prototyping models.
  • It streamlines model experimentation and evaluation.
  • It offers a flexible configuration system.
  • It facilitates result interpretation through visualization.
  • "Auto-Trainer" feature reduces redundant coding tasks.

With ABLATOR vs. Without ABLATOR

Left: ABLATOR efficiently conducts multiple trials in parallel based on the provided configuration and logs the experiment results.

Right: done manually, trials must be run sequentially, demanding more effort and independent analysis of each result.

Comparison of ABLATOR and the manual process

Install

For macOS and Linux systems, install directly via pip:

pip install ablator
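To verify the installation, you can query the installed version using the standard library. This uses importlib.metadata, not an ABLATOR-specific API.

from importlib.metadata import version

# Should print the installed ABLATOR version, e.g. "0.0.1b3".
print(version("ablator"))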

If you are using Windows, you will need to install WSL by following the official instructions from Microsoft.

WSL is a Linux subsystem, and for ABLATOR's purposes it is identical to using Linux.

Basic Concepts

1. Create your Configuration

from torch import nn
import torch
from ablator import (
    ModelConfig,
    ModelWrapper,
    OptimizerConfig,
    TrainConfig,
    configclass,
    Literal,
    ParallelTrainer,
    SearchSpace,
)
from ablator.config.mp import ParallelConfig


# Extend ABLATOR's built-in configs with experiment-specific fields.
@configclass
class TrainConfig(TrainConfig):
    dataset: str = "random"
    dataset_size: int


@configclass
class ModelConfig(ModelConfig):
    # The architectural choice we will ablate over.
    layer: Literal["layer_a", "layer_b"] = "layer_a"


@configclass
class ParallelConfig(ParallelConfig):
    # Bind the custom sub-configs defined above.
    model_config: ModelConfig
    train_config: TrainConfig


config = ParallelConfig(
    experiment_dir="ablator-exp",
    train_config=TrainConfig(
        batch_size=128,
        epochs=2,
        dataset_size=100,
        optimizer_config=OptimizerConfig(name="sgd", arguments={"lr": 0.1}),
        scheduler_config=None,
    ),
    model_config=ModelConfig(),
    device="cpu",
    # Dot-path keys into the config define the search space for the ablation.
    search_space={
        "model_config.layer": SearchSpace(categorical_values=["layer_a", "layer_b"])
    },
    total_trials=2,
)
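Because search-space keys are dot-paths into the nested configuration (as with "model_config.layer" above), other fields, including the custom ones defined here, can be ablated the same way. The sketch below is a hypothetical extension using only categorical values; the "uniform" dataset value is made up for illustration, and with two binary dimensions total_trials would typically be raised to 4.

search_space = {
    "model_config.layer": SearchSpace(categorical_values=["layer_a", "layer_b"]),
    # Hypothetical second dimension over the custom dataset field defined above.
    "train_config.dataset": SearchSpace(categorical_values=["random", "uniform"]),
}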

2. Define your Model

class SimpleModel(nn.Module):
    def __init__(self, config: ModelConfig) -> None:
        super().__init__()
        # The ablated component: the two trial variants differ only here.
        if config.layer == "layer_a":
            self.param = nn.Parameter(torch.ones(100, 1))
        else:
            self.param = nn.Parameter(torch.randn(200, 1))

    def forward(self, x: torch.Tensor):
        x = self.param
        # Return the model outputs together with the loss, following this
        # example's (outputs, loss) convention.
        return {"preds": x}, x.sum().abs()


class SimpleWrapper(ModelWrapper):
    def make_dataloader_train(self, run_config: ParallelConfig):
        # Any iterable of batches can serve as the dataloader.
        dl = [torch.rand(100) for i in range(run_config.train_config.dataset_size)]
        return dl

    def make_dataloader_val(self, run_config: ParallelConfig):
        dl = [torch.rand(100) for i in range(run_config.train_config.dataset_size)]
        return dl

3. Launch 🚀

mywrapper = SimpleWrapper(SimpleModel)
with ParallelTrainer(mywrapper, config) as ablator:
    # Launch all trials; the argument is the experiment's working directory.
    ablator.launch(".")
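After the run completes, each trial's state and results are persisted under the experiment directory from the configuration. A quick way to inspect what was written, assuming the directory is created relative to the current working directory as configured above with experiment_dir="ablator-exp":

from pathlib import Path

# List what was written for this experiment; the directory name comes from
# experiment_dir in the configuration above.
for path in sorted(Path("ablator-exp").glob("*")):
    print(path)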

Learn More about ABLATOR Modules

  • Configuration Module
  • Training Module
  • Experiment and Metrics Module
  • Analysis Module

Tutorials

Explore a variety of tutorials and examples on how to utilize ABLATOR. Ready to dive in? 👉 Ablation Tutorials

Contribution Guidelines

ABLATOR is open source, and we value contributions from our community! Check out our Development Guide for details on our development process and insights into the internals of the ABLATOR library.

For any bugs or feature requests related to ABLATOR, please visit our GitHub Issues or reach out on Slack.

Ablator Community

  • GitHub Issues: to report issues or suggest new features. (Support: ABLATOR Team)
  • Slack: to collaborate with fellow ABLATOR users. (Support: Community)
  • Discord: to inquire about ABLATOR usage and collaborate with other ABLATOR enthusiasts. (Support: Community)
  • Twitter: for staying up to date on new features of ABLATOR. (Support: ABLATOR Team)

References

@inproceedings{fostiropoulos2023ablator,
  title={ABLATOR: Robust Horizontal-Scaling of Machine Learning Ablation Experiments},
  author={Fostiropoulos, Iordanis and Itti, Laurent},
  booktitle={AutoML Conference 2023 (ABCD Track)},
  year={2023}
}

Download files

Download the file for your platform.

Source Distribution

  • ablator-0.0.1b3.tar.gz (231.6 kB)

Built Distributions

  • ablator-0.0.1b3-py3-none-manylinux2014_x86_64.whl (261.1 kB)
  • ablator-0.0.1b3-py3-none-manylinux2014_s390x.whl (261.1 kB)
  • ablator-0.0.1b3-py3-none-manylinux2014_ppc64le.whl (261.1 kB)
  • ablator-0.0.1b3-py3-none-manylinux2014_ppc64.whl (261.1 kB)
  • ablator-0.0.1b3-py3-none-manylinux2014_i686.whl (261.1 kB)
  • ablator-0.0.1b3-py3-none-manylinux2014_armv7l.whl (261.1 kB)
  • ablator-0.0.1b3-py3-none-manylinux2014_aarch64.whl (261.1 kB)
  • ablator-0.0.1b3-py3-none-macosx_11_0_arm64.whl (261.1 kB, macOS 11.0+ ARM64)
  • ablator-0.0.1b3-py3-none-macosx_10_9_x86_64.whl (261.1 kB, macOS 10.9+ x86-64)
