Project description

News

November 19: Git repo is now public

Documentation

Google Colab Examples

See the examples folder for notebooks you can download or run on Google Colab.

Overview

This library consists of 11 modules:

Module           Description
---------------  ------------------------------------------------------------
Adapters         Wrappers for training and inference steps
Containers       Dictionaries for simplifying object creation
Datasets         Commonly used datasets and tools for domain adaptation
Frameworks       Wrappers for training/testing pipelines
Hooks            Modular building blocks for domain adaptation algorithms
Layers           Loss functions and helper layers
Meta Validators  Post-processing of metrics, for hyperparameter optimization
Models           Architectures used for benchmarking and in examples
Utils            Various tools
Validators       Metrics for determining and estimating accuracy
Weighters        Functions for weighting losses
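
Each module corresponds to a subpackage of pytorch_adapt. For orientation, these are the imports used in the examples later in this README, grouped by module:

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models, Optimizers
from pytorch_adapt.datasets import DataloaderCreator
from pytorch_adapt.frameworks.ignite import Ignite
from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.validators import SNDValidator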

How to...

Use in vanilla PyTorch

from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, dataloader, and device are already created.
hook = DANNHook(optimizers)
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    loss, _ = hook({}, {**models, **data})
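
For reference, here is a minimal sketch of the objects the loop above assumes already exist. The "G"/"C"/"D" model keys follow the naming convention used elsewhere in this README; the toy architectures, learning rate, and the use of a plain list of optimizers are placeholder assumptions, not part of the library's API:

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# Generator (feature extractor), Classifier, and domain Discriminator
G = torch.nn.Linear(1000, 100).to(device)
C = torch.nn.Linear(100, 10).to(device)
D = torch.nn.Linear(100, 1).to(device)
models = {"G": G, "C": C, "D": D}
# One optimizer per model
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]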

Build complex algorithms

Let's customize DANNHook with:

  • virtual adversarial training
  • entropy conditioning

import torch
from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook, EntropyReducer, MeanReducer, VATHook
from pytorch_adapt.utils.common_functions import batch_to_device

# G and C are the Generator and Classifier models
misc = {"combined_model": torch.nn.Sequential(G, C)}
reducer = EntropyReducer(
    apply_to=["src_domain_loss", "target_domain_loss"], default_reducer=MeanReducer()
)
hook = DANNHook(optimizers, reducer=reducer, post_g=[VATHook()])
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    loss, _ = hook({}, {**models, **data, **misc})
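
In this configuration, EntropyReducer replaces the default mean reduction for the two domain losses: each sample's domain loss is weighted by the certainty of the classifier's predictions (entropy conditioning), while any other losses fall back to MeanReducer. VATHook, passed via post_g, adds the virtual adversarial training loss to the generator's optimization step, which is why the combined generator-classifier model is supplied in misc.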

Wrap with your favorite PyTorch framework

For additional functionality, adapters can be wrapped with a framework (currently just PyTorch Ignite).

import torch

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models, Optimizers
from pytorch_adapt.datasets import DataloaderCreator
from pytorch_adapt.frameworks.ignite import Ignite

# Assume G, C, and D are existing models
models = {"G": G, "C": C, "D": D}
models_cont = Models(models)
# Override the default optimizer for G and C
optimizers_cont = Optimizers((torch.optim.Adam, {"lr": 0.123}), keys=["G", "C"])
adapter = DANN(models=models_cont, optimizers=optimizers_cont)

dc = DataloaderCreator(num_workers=2)
trainer = Ignite(adapter)
trainer.run(datasets, dataloader_creator=dc)
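
Here, datasets is a dictionary of datasets keyed by split. A hedged sketch of one way to build it, using the wrapper classes from pytorch_adapt.datasets (src_train, target_train, src_val, and target_val are assumed to be ordinary PyTorch datasets, and the exact split keys should be checked against the documentation):

from pytorch_adapt.datasets import (
    CombinedSourceAndTargetDataset,
    SourceDataset,
    TargetDataset,
)

datasets = {
    "train": CombinedSourceAndTargetDataset(
        SourceDataset(src_train), TargetDataset(target_train)
    ),
    "src_train": SourceDataset(src_train),
    "target_train": TargetDataset(target_train),
    "src_val": SourceDataset(src_val),
    "target_val": TargetDataset(target_val),
}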

Wrappers for other frameworks (e.g. PyTorch Lightning and Catalyst) are planned.

Check your model's performance

You can do this in vanilla PyTorch:

from pytorch_adapt.validators import SNDValidator

# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator.score(epoch=1, target_train=target_train)
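
Here, preds is a tensor of softmax predictions for the target training set. A minimal, hypothetical sketch of collecting it (target_dataloader, the "target_imgs" batch key, and the G/C models are assumptions carried over from the earlier examples):

import torch

with torch.no_grad():
    preds = torch.cat(
        [
            torch.softmax(C(G(batch["target_imgs"].to(device))), dim=1)
            for batch in target_dataloader
        ]
    )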

You can also do this using a framework wrapper:

validator = SNDValidator()
trainer = Ignite(adapter, validator=validator)
trainer.run(datasets, dataloader_creator=dc)

Run the above examples

See this notebook and the examples page for other notebooks.

Installation

Pip

pip install pytorch-adapt

To get the latest dev version:

pip install pytorch-adapt --pre

To use pytorch_adapt.frameworks.ignite:

pip install pytorch-adapt[ignite]

Conda

Coming soon...

Dependencies

Required dependencies:

  • numpy
  • torch >= 1.6
  • torchvision
  • torchmetrics
  • pytorch-metric-learning >= 1.0.0.dev5

Acknowledgements

Contributors

Pull requests are welcome!

Advisors

Thank you to Ser-Nam Lim and my research advisor, Professor Serge Belongie.

Logo

Thanks to Jeff Musgrave for designing the logo.

Project details

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch-adapt-0.0.35.tar.gz (77.7 kB)

Uploaded Source

Built Distribution

pytorch_adapt-0.0.35-py3-none-any.whl (124.4 kB)

Uploaded Python 3

File details

Details for the file pytorch-adapt-0.0.35.tar.gz.

File metadata

  • Download URL: pytorch-adapt-0.0.35.tar.gz
  • Upload date:
  • Size: 77.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.8.10

File hashes

Hashes for pytorch-adapt-0.0.35.tar.gz
Algorithm    Hash digest
SHA256       c3bf10b25b3d4ba1d45220684aa222f9e2c99c913f3cb25fe3d2f961c344186d
MD5          345de564bd3c266d5ee445539a3d64c8
BLAKE2b-256  96d1df8b6c496a639bb240b988aa3be86db1863137243968d5744ba36c0a3d12


File details

Details for the file pytorch_adapt-0.0.35-py3-none-any.whl.

File metadata

  • Download URL: pytorch_adapt-0.0.35-py3-none-any.whl
  • Upload date:
  • Size: 124.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.8.10

File hashes

Hashes for pytorch_adapt-0.0.35-py3-none-any.whl
Algorithm    Hash digest
SHA256       7d3de6b9ed5faf298c26028477904a95d67f39e15f542ae29505264448d98727
MD5          7f0e1ed94bd7f4b75d8fc3a225c12477
BLAKE2b-256  f5fb3169a7e03f3c445dadc1364853f244b9846fc29c295805ee7dc05db2c2d6

