
nnutils-pytorch

PyTorch bindings of different neural network-related utilities implemented for CPUs and GPUs (CUDA).

So far, most of the utilities relate to my need to work with images of different sizes grouped into padded batches.

Included functions

Adaptive pooling

Adaptive pooling layers included in several packages like Torch or PyTorch assume that all images in the batch have the same size. This implementation takes into account the size of each individual image within the batch (before padding) to apply the adaptive pooling.

Currently implemented: Average and maximum adaptive pooling.

import torch
from nnutils_pytorch import adaptive_avgpool_2d, adaptive_maxpool_2d

# Two random images, with three channels, 10 pixels height, 12 pixels width
x = torch.rand(2, 3, 10, 12)
# Matrix (N x 2) containing the height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Pool images to a fixed size, taking into account the original size of each
# image before padding.
#
# Output tensor has shape (2, 3, 3, 5)
y1 = adaptive_avgpool_2d(batch_input=x, output_sizes=(3, 5), batch_sizes=xs)

# Pool a single dimension of the images, taking into account the original
# size of each image before padding. The None dimension is not pooled.
#
# Output tensor has shape (2, 3, 5, 12)
y2 = adaptive_maxpool_2d(x, (5, None), xs)

Important: The implementation assumes that the images are aligned to the top-left corner.
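
For reference, this size-aware pooling can be emulated in plain PyTorch by cropping each image to its true size before pooling it. The sketch below is only an illustrative (and slow, per-image) emulation of the described semantics, not the library's batched kernel; the helper name is hypothetical and fixed output sizes are assumed (no None dimension).

import torch
import torch.nn.functional as F

def adaptive_avgpool_2d_reference(x, output_sizes, xs):
    # Hypothetical reference helper, not part of nnutils-pytorch.
    out_h, out_w = output_sizes
    y = x.new_empty(x.size(0), x.size(1), out_h, out_w)
    for n in range(x.size(0)):
        h, w = int(xs[n, 0]), int(xs[n, 1])
        # Crop to the true size (top-left aligned), then pool the crop.
        y[n] = F.adaptive_avg_pool2d(x[n, :, :h, :w], (out_h, out_w))
    return y

If the semantics are as described above, adaptive_avgpool_2d_reference(x, (3, 5), xs) should agree with y1 from the previous example.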

Masking images by size

If you are grouping images of different sizes into batches padded with zeros, you may need to mask the output/input tensors after/before some layers. This layer is very handy (and efficient) in these cases.

import torch
from nnutils_pytorch import mask_image_from_size

# Two random images, with three channels, 10 pixels height, 12 pixels width
x = torch.rand(2, 3, 10, 12)
# Matrix (N x 2) containing the height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Note: mask_image_from_size is differentiable w.r.t. x
y = mask_image_from_size(x, xs, mask_value=0)  # mask_value is optional.

Important: The implementation assumes that the images are aligned to the top-left corner.
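
Conceptually, the operation builds a per-image validity mask from xs and overwrites everything outside each image's valid region with mask_value. A minimal sketch in plain PyTorch, assuming top-left alignment (the helper name is hypothetical; this is not the library's kernel):

import torch

def mask_image_from_size_reference(x, xs, mask_value=0):
    n, _, max_h, max_w = x.size()
    xs = xs.to(x.device)
    hs = torch.arange(max_h, device=x.device).view(1, 1, max_h, 1)
    ws = torch.arange(max_w, device=x.device).view(1, 1, 1, max_w)
    # True inside the valid (top-left aligned) region of each image.
    valid = ((hs < xs[:, 0].view(n, 1, 1, 1)) &
             (ws < xs[:, 1].view(n, 1, 1, 1)))
    # Gradients flow through the kept entries of x; masked entries are
    # replaced by a constant, so their gradient is zero.
    return torch.where(valid, x, torch.full_like(x, float(mask_value)))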

Requirements

  • Python: 3.5, 3.6, 3.7 or 3.8.
  • PyTorch >= 1.5.1 (tested with version 1.5.1).
  • C++14 compiler (tested with GCC 7.5.0).
  • For GPU support: CUDA Toolkit.
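
Before building, you can quickly confirm the Python and PyTorch versions and whether CUDA is visible to PyTorch (a minimal check, not a substitute for the full requirements above):

import sys
import torch

print("Python:", sys.version.split()[0])          # expect 3.5 - 3.8
print("PyTorch:", torch.__version__)              # expect >= 1.5.1
print("CUDA available:", torch.cuda.is_available())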

Installation

The installation process should be pretty straightforward assuming that you have correctly installed the required libraries and tools.

The setup process compiles the package from source, and builds with CUDA support when your PyTorch installation provides CUDA.

From PyPI (recommended)

pip install nnutils-pytorch

Prebuilt wheels for various Python, CUDA and CPU configurations are available at: http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/

For instance, if you want to install the CPU-only version for Python 3.7:

pip install http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/cpu/nnutils_pytorch-1.5.1-cp37-cp37m-linux_x86_64.whl

From GitHub

git clone https://github.com/jpuigcerver/nnutils.git
cd nnutils/pytorch
python setup.py build
python setup.py install

AVX512 related issues

Compilation problems may arise when using CUDA and newer host compilers that emit AVX512 instructions. Install GCC 7.5 or above and use it as the host compiler for NVCC 10.2. You can simply set the CC and CXX environment variables before the build/install commands:

CC=gcc-7 CXX=g++-7 pip install nnutils-pytorch

or (if you are using the GitHub source code):

CC=gcc-7 CXX=g++-7 python setup.py build

Testing

Once installed, you can test the library using unittest. In particular, run the following commands:

python -m unittest nnutils_pytorch.adaptive_avgpool_2d_test
python -m unittest nnutils_pytorch.adaptive_maxpool_2d_test
python -m unittest nnutils_pytorch.mask_image_from_size_test

All tests should pass (CUDA tests are only executed if supported).
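
As an additional post-install smoke test, you can exercise the documented functions directly; a minimal sketch based on the examples above:

import torch
from nnutils_pytorch import adaptive_avgpool_2d, mask_image_from_size

x = torch.rand(2, 3, 10, 12)
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Expected shapes follow the examples in this document.
assert adaptive_avgpool_2d(x, output_sizes=(3, 5), batch_sizes=xs).shape == (2, 3, 3, 5)
assert mask_image_from_size(x, xs).shape == x.shape
print("nnutils-pytorch smoke test passed")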
