Torchvision-complementary tool to perform batch and GPU data augmentations.
Project description
Efficient vision data augmentations for CPU/GPU per-sample/batched data.
Under active development, subject to API change
Torchaug
Introduction
Torchaug is a data augmentation library for the PyTorch ecosystem. It is meant to deal efficiently with tensors that are either on CPU or GPU and either per-sample or batched.
It seeks to improve on the performance of Torchvision, which is built on top of PyTorch and Pillow to, among other things, perform data augmentations. However, because Torchvision was designed first with per-sample CPU data augmentations in mind, it has several drawbacks that limit its efficiency:
- For data augmentations on GPU, some CPU/GPU synchronizations cannot be avoided.
- For data augmentations applied to a batch, the randomness is sampled once for the whole batch and not per sample (see the sketch below).
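To illustrate the second point, here is a minimal sketch in plain PyTorch (not Torchaug code; the brightness-style scaling is only an illustrative stand-in for a real augmentation) contrasting per-batch and per-sample random parameters:

import torch

batch = torch.rand(8, 3, 224, 224)  # a batch of 8 RGB images

# Per-batch randomness (the Torchvision behaviour described above):
# a single factor is drawn and shared by every sample in the batch.
shared_factor = torch.empty(1).uniform_(0.5, 1.5)
out_shared = batch * shared_factor

# Per-sample randomness (what Torchaug's batch transforms provide):
# one factor is drawn per sample and broadcast over its channels and pixels.
per_sample_factors = torch.empty(8, 1, 1, 1).uniform_(0.5, 1.5)
out_per_sample = batch * per_sample_factors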
Torchaug removes these issues and is meant to be used as a complement to Torchvision. It follows the same nomenclature as Torchvision, with functional augmentations and transform class wrappers. It is split into two packages:
- transforms for per-sample data augmentations
- batch_transforms for batched data augmentations.
More details can be found in the documentation.
To ensure it provides the same data augmentations as Torchvision, each of its components has been tested to match Torchvision's outputs.
How to use
- Install a PyTorch >= 2.0 environment.
- Install Torchaug:

      pip install torchaug

- Import data augmentations from either the torchaug.transforms or the torchaug.batch_transforms package. To ease handling multiple sequential augmentations, wrappers have been defined:
from torchaug.transforms import (
    ImageWrapper,
    RandomColorJitter,
    RandomGaussianBlur,
)
from torchaug.batch_transforms import (
    BatchImageWrapper,
    BatchRandomColorJitter,
    BatchRandomHorizontalFlip,
)

transform = ImageWrapper(
    [RandomColorJitter(...), RandomGaussianBlur(...)],
)

batch_transform = BatchImageWrapper(
    [BatchRandomColorJitter(...), BatchRandomHorizontalFlip(...)],
    inplace=True,
)
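The wrappers can then be called directly on tensors. A minimal usage sketch follows (the tensor shapes and the random input values are assumptions for illustration, not requirements taken from the documentation):

import torch

image = torch.rand(3, 224, 224)        # single CHW image (assumed shape)
augmented_image = transform(image)

images = torch.rand(16, 3, 224, 224)   # BCHW batch of images (assumed shape)
augmented_batch = batch_transform(images)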
How to contribute
Feel free to contribute to this library by opening issues and/or pull requests. For each feature you implement, add tests to make sure it works, and update the documentation accordingly.
LICENSE
This project is under the Apache License 2.0.
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file torchaug-0.3.1.tar.gz.
File metadata
- Download URL: torchaug-0.3.1.tar.gz
- Upload date:
- Size: 30.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | ad57612761046b4c51491a1fe76fd406200e6fd62cfd5dc8decd356a9fdf15fd
MD5 | 5c0a8c631366d43380c80cde8f9323ca
BLAKE2b-256 | 80420f9e17c8c1a4ede8f2b38b274f43bec80c6c5ebe6f96b0a05c4a7b7fffd4
File details
Details for the file torchaug-0.3.1-py3-none-any.whl.
File metadata
- Download URL: torchaug-0.3.1-py3-none-any.whl
- Upload date:
- Size: 32.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0817c831dfc1d9fc0d23fbeac47c2a9bf11c4936336dd5d1fad1c6737fe3e579
MD5 | 79084bfb4bbed441dc974018e08e9eba
BLAKE2b-256 | d0a65959ca953df9d63210d749ed6a31944f2b5a684599f15cf4460e9e230328