Project description

News

November 19: Git repo is now public

Documentation

Google Colab Examples

See the examples folder for notebooks you can download or run on Google Colab.

Overview

This library consists of 11 modules:

Module | Description
Adapters | Wrappers for training and inference steps
Containers | Dictionaries for simplifying object creation
Datasets | Commonly used datasets and tools for domain adaptation
Frameworks | Wrappers for training/testing pipelines
Hooks | Modular building blocks for domain adaptation algorithms
Layers | Loss functions and helper layers
Meta Validators | Post-processing of metrics, for hyperparameter optimization
Models | Architectures used for benchmarking and in examples
Utils | Various tools
Validators | Metrics for determining and estimating accuracy
Weighters | Functions for weighting losses
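
Each row above maps to a submodule of the pytorch_adapt package. For orientation, the classes used in the how-to sections below are imported from their respective modules like this:

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models, Optimizers
from pytorch_adapt.datasets import get_mnist_mnistm
from pytorch_adapt.frameworks import Ignite
from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.validators import SNDValidator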

How to...

Use in vanilla PyTorch

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, dataloader, and device are already created.
hook = DANNHook(optimizers)
for data in dataloader:
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    loss, _ = hook({}, {**models, **data})
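
For concreteness, here is one way the assumed objects could be created. This is only a sketch: the toy Linear layers, tensor sizes, and learning rate are placeholders, not the library's reference setup. DANN needs a feature generator G, a classifier C, and a domain discriminator D, each with an optimizer:

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy stand-ins for real architectures (shapes are arbitrary here)
G = torch.nn.Linear(1000, 100).to(device)  # feature generator
C = torch.nn.Linear(100, 10).to(device)    # classifier
D = torch.nn.Linear(100, 1).to(device)     # domain discriminator

models = {"G": G, "C": C, "D": D}
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]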

Build complex algorithms

Let's customize DANNHook with:

  • virtual adversarial training
  • entropy conditioning

import torch

from pytorch_adapt.hooks import DANNHook, EntropyReducer, MeanReducer, VATHook
from pytorch_adapt.utils.common_functions import batch_to_device

# G and C are the Generator and Classifier models
misc = {"combined_model": torch.nn.Sequential(G, C)}
reducer = EntropyReducer(
    apply_to=["src_domain_loss", "target_domain_loss"], default_reducer=MeanReducer()
)
hook = DANNHook(optimizers, reducer=reducer, post_g=[VATHook()])
for data in dataloader:
    data = batch_to_device(data, device)
    loss, _ = hook({}, {**models, **data, **misc})
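
Here, EntropyReducer reduces each listed domain loss to a weighted mean that down-weights samples whose classifier predictions have high entropy, while MeanReducer handles all other losses. A rough, self-contained sketch of the entropy-conditioning idea (not the library's exact implementation):

import torch

# Schematic entropy conditioning: confident (low-entropy) predictions
# get larger weight in the reduced loss.
preds = torch.softmax(torch.randn(8, 10), dim=1)
entropy = -(preds * torch.log(preds)).sum(dim=1)
weights = 1 + torch.exp(-entropy)
per_sample_loss = torch.randn(8).abs()
loss = (weights * per_sample_loss).sum() / weights.sum()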

Remove some boilerplate

Adapters and containers can simplify object creation.

import torch

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models, Optimizers

# Assume G, C, and D are existing models
models = Models({"G": G, "C": C, "D": D})
# Override the default optimizer for G and C
optimizers = Optimizers((torch.optim.Adam, {"lr": 0.123}), keys=["G", "C"])
adapter = DANN(models=models, optimizers=optimizers)

for data in dataloader:
    adapter.training_step(data, device)
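
Because the containers hold ordinary PyTorch modules, the trained models remain directly usable for inference. A minimal sketch, where target_imgs is an assumed batch of target-domain images:

with torch.no_grad():
    logits = C(G(target_imgs.to(device)))
    predictions = torch.softmax(logits, dim=1)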

Wrap with your favorite PyTorch framework

For additional functionality, adapters can be wrapped with a framework (currently just PyTorch Ignite).

from pytorch_adapt.frameworks import Ignite

wrapped_adapter = Ignite(adapter)
wrapped_adapter.run(datasets)

Wrappers for other frameworks (e.g. PyTorch Lightning and Catalyst) are coming soon.

Check accuracy of your model

You can do this in vanilla PyTorch:

from pytorch_adapt.validators import SNDValidator

# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator.score(epoch=1, target_train=target_train)
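
For reference, here is one way the preds tensor could be collected beforehand. This is a sketch only: target_dataloader is a hypothetical loader of target-domain batches, the "target_imgs" key is an assumption, and G, C, and device come from the earlier examples. SND scores prediction vectors, hence the softmax:

import torch

preds = []
with torch.no_grad():
    for data in target_dataloader:
        imgs = data["target_imgs"].to(device)
        preds.append(torch.softmax(C(G(imgs)), dim=1))
preds = torch.cat(preds, dim=0)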

You can also do this using a framework wrapper:

from pytorch_adapt.validators import SNDValidator

validator = SNDValidator()
wrapped_adapter.run(datasets, validator=validator)

Load a toy dataset

import torch

from pytorch_adapt.datasets import get_mnist_mnistm

# mnist is the source domain
# mnistm is the target domain
datasets = get_mnist_mnistm(["mnist"], ["mnistm"], ".", download=True)
dataloader = torch.utils.data.DataLoader(
    datasets["train"], batch_size=32, num_workers=2
)
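
To see the dict format that the hooks and adapters above consume, it can help to inspect a single batch. The exact key names are determined by the dataset wrappers, so treat the comment below as illustrative rather than definitive:

batch = next(iter(dataloader))
# Expect a dict mixing source and target entries,
# e.g. images, labels, and domain indicators for each domain.
print(batch.keys())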

Run the above examples

See this notebook and the examples page for other notebooks.

Installation

Pip

pip install pytorch-adapt

To get the latest dev version:

pip install pytorch-adapt --pre

Conda

Coming soon...

Dependencies

Coming soon...

Acknowledgements

Contributors

Pull requests are welcome!

Advisors

Thank you to Ser-Nam Lim and to my research advisor, Professor Serge Belongie.

Logo

Thanks to Jeff Musgrave for designing the logo.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch-adapt-0.0.32.tar.gz (77.5 kB)

Uploaded Source

Built Distribution

pytorch_adapt-0.0.32-py3-none-any.whl (124.2 kB)

Uploaded Python 3

File details

Details for the file pytorch-adapt-0.0.32.tar.gz.

File metadata

  • Download URL: pytorch-adapt-0.0.32.tar.gz
  • Upload date:
  • Size: 77.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.8.10

File hashes

Hashes for pytorch-adapt-0.0.32.tar.gz

Algorithm | Hash digest
SHA256 | 34192bd24c5b07e56b607c9eb436c49535d2677704a9b311491c3cff0491c059
MD5 | 5cf35650373ecfad972f6de152464cce
BLAKE2b-256 | c2d21823436476b25ec7a1a284a5d7a09b7178829bc224599f9362cb0ca807b4

See more details on using hashes here.

File details

Details for the file pytorch_adapt-0.0.32-py3-none-any.whl.

File metadata

  • Download URL: pytorch_adapt-0.0.32-py3-none-any.whl
  • Upload date:
  • Size: 124.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.8.10

File hashes

Hashes for pytorch_adapt-0.0.32-py3-none-any.whl

Algorithm | Hash digest
SHA256 | b7e143979719b205692a17ee7167bf913c7489422bea6a0e9bfa9ad513782349
MD5 | 3a8d2d57415bdc579b12499340f41aa9
BLAKE2b-256 | 18c9fce5d58cb0c2f980d0de238fa0ec47116e1db169f59f6d8d4f660dd7fa70

See more details on using hashes here.
