
PyTorch Adapt

Why use PyTorch Adapt?

PyTorch Adapt provides tools for domain adaptation, a machine learning technique for repurposing existing models to work in new domains. This library is:

1. Fully featured

Build a complete train/val domain adaptation pipeline in a few lines of code.

2. Modular

Use just the parts that suit your needs, whether it's the algorithms, loss functions, or validation methods.

3. Highly customizable

Customize and combine complex algorithms with ease.

4. Compatible with frameworks

Add functionality to your code by using one of the framework wrappers. For example, converting an algorithm into a PyTorch Lightning module is as simple as wrapping it with Lightning.

Documentation

Getting started

See the examples folder for notebooks you can download or run on Google Colab.

How to...

Use in vanilla PyTorch

from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, dataloader, and device are already created.
hook = DANNHook(optimizers)
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    loss, _ = hook({}, {**models, **data})
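
For reference, here is a minimal sketch of the setup assumed above. It follows the DANN convention of a feature extractor "G", classifier "C", and domain discriminator "D" passed around in a dictionary, and it assumes the hook accepts a plain list of optimizers; the architectures are hypothetical placeholders:

import torch

# Hypothetical architectures: any feature extractor (G), classifier (C),
# and domain discriminator (D) will do.
G = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(784, 256), torch.nn.ReLU())
C = torch.nn.Linear(256, 10)
D = torch.nn.Sequential(torch.nn.Linear(256, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
models = {"G": G.to(device), "C": C.to(device), "D": D.to(device)}

# One optimizer per model; the hook steps them after computing the losses.
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]

# The dataloader should yield dicts of source and target batches, e.g. one
# built with pytorch_adapt.datasets and DataloaderCreator (see further below).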

Build complex algorithms

Let's customize DANNHook with:

  • minimum class confusion
  • virtual adversarial training
import torch

from pytorch_adapt.hooks import DANNHook, MCCHook, VATHook

# G and C are the Generator and Classifier models
G, C = models["G"], models["C"]
misc = {"combined_model": torch.nn.Sequential(G, C)}
hook = DANNHook(optimizers, post_g=[MCCHook(), VATHook()])
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    loss, _ = hook({}, {**models, **data, **misc})

Wrap with your favorite PyTorch framework

First, set up the adapter and dataloaders:

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models
from pytorch_adapt.datasets import DataloaderCreator

models_cont = Models(models)
adapter = DANN(models=models_cont)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)
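
Here, datasets is assumed to be a dict mapping split names to PyTorch datasets. Below is a hypothetical sketch of how it could be built with the library's dataset wrappers; the split names ("src_train", "src_val", "target_train", "target_val", "train") and the use of MNIST for both domains are assumptions made for illustration:

from torchvision import transforms
from torchvision.datasets import MNIST

from pytorch_adapt.datasets import (
    CombinedSourceAndTargetDataset,
    SourceDataset,
    TargetDataset,
)

# Hypothetical splits: in practice the source and target would be two
# different domains (e.g. MNIST and MNIST-M).
t = transforms.ToTensor()
src_train = SourceDataset(MNIST(".", train=True, download=True, transform=t))
src_val = SourceDataset(MNIST(".", train=False, download=True, transform=t))
target_train = TargetDataset(MNIST(".", train=True, download=True, transform=t))
target_val = TargetDataset(MNIST(".", train=False, download=True, transform=t))

datasets = {
    "src_train": src_train,
    "src_val": src_val,
    "target_train": target_train,
    "target_val": target_val,
    # The combined source+target dataset is what the adapter trains on.
    "train": CombinedSourceAndTargetDataset(src_train, target_train),
}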

Then use a framework wrapper:

PyTorch Lightning

import pytorch_lightning as pl
from pytorch_adapt.frameworks.lightning import Lightning

L_adapter = Lightning(adapter)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, dataloaders["train"])

PyTorch Ignite

from pytorch_adapt.frameworks.ignite import Ignite

trainer = Ignite(adapter)
trainer.run(datasets, dataloader_creator=dc)

Check your model's performance

You can do this in vanilla PyTorch:

from pytorch_adapt.validators import SNDValidator

# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator(target_train=target_train)
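
For example, the predictions could be gathered with a plain evaluation loop. This sketch reuses models, dataloaders, device, and batch_to_device from the earlier examples, and assumes that target batches store their images under a "target_imgs" key and that the validator expects softmax probabilities:

import torch

# Collect softmax predictions on the target training set.
G, C = models["G"], models["C"]
G.eval()
C.eval()

all_preds = []
with torch.no_grad():
    for data in dataloaders["target_train"]:
        data = batch_to_device(data, device)
        logits = C(G(data["target_imgs"]))  # batch key name is an assumption
        all_preds.append(torch.softmax(logits, dim=1))

preds = torch.cat(all_preds, dim=0)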

You can also do this during training with a framework wrapper:

PyTorch Lightning

from pytorch_adapt.frameworks.utils import filter_datasets

validator = SNDValidator()
dataloaders = dc(**filter_datasets(datasets, validator))
train_loader = dataloaders.pop("train")

L_adapter = Lightning(adapter, validator=validator)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, train_loader, list(dataloaders.values()))

PyTorch Ignite

from pytorch_adapt.validators import ScoreHistory

validator = ScoreHistory(SNDValidator())
trainer = Ignite(adapter, validator=validator)
trainer.run(datasets, dataloader_creator=dc)

Run the above examples

See this notebook and the examples page for other notebooks.

Installation

Pip

pip install pytorch-adapt

To get the latest dev version:

pip install pytorch-adapt --pre

To use pytorch_adapt.frameworks.lightning:

pip install pytorch-adapt[lightning]

To use pytorch_adapt.frameworks.ignite:

pip install pytorch-adapt[ignite]

Conda

Coming soon...

Dependencies

Required dependencies:

  • numpy
  • torch >= 1.6
  • torchvision
  • torchmetrics
  • pytorch-metric-learning >= 1.0.0.dev5

Acknowledgements

Contributors

Pull requests are welcome!

Advisors

Thank you to Ser-Nam Lim and my research advisor, Professor Serge Belongie.

Logo

Thanks to Jeff Musgrave for designing the logo.

Code references (in no particular order)


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch-adapt-0.0.50.dev0.tar.gz (85.9 kB, Source)

Built Distribution

pytorch_adapt-0.0.50.dev0-py3-none-any.whl (137.0 kB, Python 3)

File details

Details for the file pytorch-adapt-0.0.50.dev0.tar.gz.

File metadata

  • Download URL: pytorch-adapt-0.0.50.dev0.tar.gz
  • Upload date:
  • Size: 85.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.1 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for pytorch-adapt-0.0.50.dev0.tar.gz
  • SHA256: 705b9b051694df7f2718bf668109d60a150ed79abe17583e35c738f6ea37f827
  • MD5: 80052a2eb0df60f3dbc46f8030d9a1c5
  • BLAKE2b-256: 60c52d6e89a1f18d37022595f2c871304900046e790bef3bc3de17bf93182846


File details

Details for the file pytorch_adapt-0.0.50.dev0-py3-none-any.whl.

File metadata

  • Download URL: pytorch_adapt-0.0.50.dev0-py3-none-any.whl
  • Upload date:
  • Size: 137.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.1 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for pytorch_adapt-0.0.50.dev0-py3-none-any.whl
  • SHA256: 9831642b281eed009ec9e1e9ac0aa887e2c4082a14da152c7c66424681f43d15
  • MD5: c93d607a63fdf5358f93e32c320745cc
  • BLAKE2b-256: 41711f1a3846d28bace0105a9bcaa8b3d0058e93bbe8efc64f450a8df379d692

