PyTorch Adapt

What does it do?

PyTorch Adapt provides tools for domain adaptation, a type of machine learning that repurposes existing models to work in new domains.

Benefits

1. Fully featured

Build a complete train/val domain adaptation pipeline in a few lines of code.

2. Modular

Use just the parts that suit your needs, whether it's the algorithms, loss functions, or validation methods.

3. Highly customizable

Customize and combine complex algorithms with ease.

4. Compatible with frameworks

Add functionality to your code by using one of the framework wrappers. For example, converting an algorithm into a PyTorch Lightning module is as simple as wrapping it with Lightning.

Documentation

Getting started

See the examples folder for notebooks you can download or run on Google Colab.

How to...

Use in vanilla PyTorch

from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, and dataloader are already created.
hook = DANNHook(optimizers)
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    loss, _ = hook({}, {**models, **data})
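As a rough sketch of what the assumed `models`, `optimizers`, and `device` might look like (the three-model `G`/`C`/`D` layout follows the DANN convention of a Generator, Classifier, and domain Discriminator, but every layer size here is a made-up placeholder):

```python
import torch

# Hypothetical sizes, for illustration only.
input_size, feature_size, num_classes = 1000, 64, 10

# DANN uses a Generator (feature extractor) "G", a Classifier "C",
# and a domain Discriminator "D".
models = {
    "G": torch.nn.Linear(input_size, feature_size),
    "C": torch.nn.Linear(feature_size, num_classes),
    "D": torch.nn.Linear(feature_size, 1),
}

# One optimizer per model.
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
```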

Build complex algorithms

Let's customize DANNHook with:

  • minimum class confusion
  • virtual adversarial training

import torch

from pytorch_adapt.hooks import DANNHook, MCCHook, VATHook

# G and C are the Generator and Classifier models.
G, C = models["G"], models["C"]
misc = {"combined_model": torch.nn.Sequential(G, C)}
hook = DANNHook(optimizers, post_g=[MCCHook(), VATHook()])
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    loss, _ = hook({}, {**models, **data, **misc})

Wrap with your favorite PyTorch framework

First, set up the adapter and dataloaders:

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models
from pytorch_adapt.datasets import DataloaderCreator

models_cont = Models(models)
adapter = DANN(models=models_cont)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)

Then use a framework wrapper:

PyTorch Lightning

import pytorch_lightning as pl
from pytorch_adapt.frameworks.lightning import Lightning

L_adapter = Lightning(adapter)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, dataloaders["train"])

PyTorch Ignite

from pytorch_adapt.frameworks.ignite import Ignite

trainer = Ignite(adapter)
trainer.run(datasets, dataloader_creator=dc)

Check your model's performance

You can do this in vanilla PyTorch:

from pytorch_adapt.validators import SNDValidator

# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator.score(target_train=target_train)
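Here `preds` is assumed to be a tensor of softmax probabilities over the target training set. A minimal sketch of collecting it (the `collect_preds` helper, model, and dataloader names are placeholders, not part of the library):

```python
import torch

def collect_preds(model, dataloader, device):
    # Gather softmax probabilities for every batch, without gradients.
    model.eval()
    all_preds = []
    with torch.no_grad():
        for imgs in dataloader:
            logits = model(imgs.to(device))
            all_preds.append(torch.softmax(logits, dim=1))
    return torch.cat(all_preds, dim=0)
```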

You can also do this during training with a framework wrapper:

Lightning

from pytorch_adapt.frameworks.utils import filter_datasets

validator = SNDValidator()
dataloaders = dc(**filter_datasets(datasets, validator))
train_loader = dataloaders.pop("train")

L_adapter = Lightning(adapter, validator=validator)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, train_loader, list(dataloaders.values()))

Ignite

from pytorch_adapt.validators import ScoreHistory

validator = ScoreHistory(SNDValidator())
trainer = Ignite(adapter, validator=validator)
trainer.run(datasets, dataloader_creator=dc)

Run the above examples

See this notebook and the examples page for other notebooks.

Installation

Pip

pip install pytorch-adapt

To get the latest dev version:

pip install pytorch-adapt --pre

To use pytorch_adapt.frameworks.lightning:

pip install pytorch-adapt[lightning]

To use pytorch_adapt.frameworks.ignite:

pip install pytorch-adapt[ignite]

Conda

Coming soon...

Dependencies

Required dependencies:

  • numpy
  • torch >= 1.6
  • torchvision
  • torchmetrics
  • pytorch-metric-learning >= 1.0.0.dev5

Acknowledgements

Contributors

Pull requests are welcome!

Advisors

Thank you to Ser-Nam Lim and my research advisor, Professor Serge Belongie.

Logo

Thanks to Jeff Musgrave for designing the logo.

Code references (in no particular order)
