News
Documentation
Google Colab Examples
See the examples folder for notebooks you can download or run on Google Colab.
Overview
This library consists of 11 modules:
Module | Description |
---|---|
Adapters | Wrappers for training and inference steps |
Containers | Dictionaries for simplifying object creation |
Datasets | Commonly used datasets and tools for domain adaptation |
Frameworks | Wrappers for training/testing pipelines |
Hooks | Modular building blocks for domain adaptation algorithms |
Layers | Loss functions and helper layers |
Meta Validators | Post-processing of metrics, for hyperparameter optimization |
Models | Architectures used for benchmarking and in examples |
Utils | Various tools |
Validators | Metrics for determining and estimating accuracy |
Weighters | Functions for weighting losses |
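As a hedged orientation sketch, the snippet below imports one name from several of these modules; every import shown here also appears in the examples later in this README.

```python
from pytorch_adapt.adapters import DANN                  # Adapters
from pytorch_adapt.containers import Models, Optimizers  # Containers
from pytorch_adapt.datasets import get_mnist_mnistm      # Datasets
from pytorch_adapt.frameworks import Ignite              # Frameworks
from pytorch_adapt.hooks import DANNHook                 # Hooks
from pytorch_adapt.validators import SNDValidator        # Validators
```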
How to...
Use in vanilla PyTorch
```python
from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, and dataloader are already created.
hook = DANNHook(optimizers)
for data in dataloader:
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    loss, _ = hook({}, {**models, **data})
```
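The snippet above assumes that `models`, `optimizers`, `device`, and `dataloader` already exist. As a minimal sketch (the `"G"`/`"C"`/`"D"` naming follows the later examples in this README, while the architectures and learning rate are placeholder assumptions), they might be created like this:

```python
import torch

# Hypothetical placeholder models: G extracts features, C classifies them,
# and D discriminates between source and target domains.
G = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 128))
C = torch.nn.Linear(128, 10)
D = torch.nn.Linear(128, 1)

models = {"G": G, "C": C, "D": D}
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
```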
Build complex algorithms
Let's customize DANNHook with:

- virtual adversarial training
- entropy conditioning
```python
import torch

from pytorch_adapt.hooks import DANNHook, EntropyReducer, MeanReducer, VATHook
from pytorch_adapt.utils.common_functions import batch_to_device

# G and C are the Generator and Classifier models.
misc = {"combined_model": torch.nn.Sequential(G, C)}
reducer = EntropyReducer(
    apply_to=["src_domain_loss", "target_domain_loss"], default_reducer=MeanReducer()
)
hook = DANNHook(optimizers, reducer=reducer, post_g=[VATHook()])
for data in dataloader:
    data = batch_to_device(data, device)
    loss, _ = hook({}, {**models, **data, **misc})
```
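Roughly speaking, `EntropyReducer` replaces the plain mean of the listed domain losses with an entropy-weighted reduction, so the adversarial signal is conditioned on how confident the classifier's predictions are, while `post_g=[VATHook()]` adds virtual adversarial training's smoothness penalty to the generator's objective.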
Remove some boilerplate
Adapters and containers can simplify object creation.
```python
import torch

from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models, Optimizers

# Assume G, C, and D are existing models.
models = Models({"G": G, "C": C, "D": D})

# Override the default optimizer for G and C.
optimizers = Optimizers((torch.optim.Adam, {"lr": 0.123}), keys=["G", "C"])

adapter = DANN(models=models, optimizers=optimizers)
for data in dataloader:
    adapter.training_step(data, device)
```
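Because `keys=["G", "C"]` is given, the `(optimizer class, kwargs)` override applies only to G and C; D keeps the default optimizer.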
Wrap with your favorite PyTorch framework
For additional functionality, adapters can be wrapped with a framework (currently just PyTorch Ignite).
```python
from pytorch_adapt.frameworks import Ignite

wrapped_adapter = Ignite(adapter)
wrapped_adapter.run(datasets)
```
Wrappers for other frameworks (e.g. PyTorch Lightning and Catalyst) are coming soon.
Check accuracy of your model
You can do this in vanilla PyTorch:
```python
from pytorch_adapt.validators import SNDValidator

# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator.score(epoch=1, target_train=target_train)
```
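For context, here is a hedged sketch of how those predictions might be collected, reusing `G`, `C`, `device`, and `dataloader` from the earlier snippets; the `"target_imgs"` batch key is an assumption about the batch format, not a documented guarantee:

```python
import torch
import torch.nn.functional as F

# Hypothetical collection loop: gather softmaxed predictions over the
# target training data. The "target_imgs" key is an assumption.
all_preds = []
with torch.no_grad():
    for data in dataloader:
        logits = C(G(data["target_imgs"].to(device)))
        all_preds.append(F.softmax(logits, dim=1).cpu())
preds = torch.cat(all_preds, dim=0)
```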
You can also do this using a framework wrapper:
```python
from pytorch_adapt.validators import SNDValidator

validator = SNDValidator()
wrapped_adapter.run(datasets, validator=validator)
```
Load a toy dataset
```python
import torch

from pytorch_adapt.datasets import get_mnist_mnistm

# mnist is the source domain
# mnistm is the target domain
datasets = get_mnist_mnistm(["mnist"], ["mnistm"], ".")

dataloader = torch.utils.data.DataLoader(
    datasets["train"], batch_size=32, num_workers=2
)
```
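Each batch produced by this dataloader is a dictionary. A quick way to inspect it (a hedged sketch; the exact keys and shapes depend on the library):

```python
batch = next(iter(dataloader))
# Print each key with its tensor shape (or the raw value for non-tensors).
print({k: getattr(v, "shape", v) for k, v in batch.items()})
```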
Run the above examples
See this notebook and the examples page for other notebooks.
Installation
Pip
```
pip install pytorch-adapt
```
To get the latest dev version:
```
pip install pytorch-adapt --pre
```
Conda
Coming soon...
Dependencies
Coming soon...
Acknowledgements
Contributors
Pull requests are welcome!
Advisors
Thank you to Ser-Nam Lim, and my research advisor, Professor Serge Belongie.
Logo
Thanks to Jeff Musgrave for designing the logo.