
PyTorch bindings of the nnutils library

Project description

nnutils-pytorch


PyTorch bindings of different neural network-related utilities implemented for CPUs and GPUs (CUDA).

So far, most of the utilities stem from my need to work with images of different sizes grouped into padded batches.

Included functions

Adaptive pooling

Adaptive pooling layers in packages such as Torch or PyTorch assume that all images in the batch have the same size. This implementation instead takes into account the size of each individual image within the batch (before padding) when applying the adaptive pooling.

Currently implemented: Average and maximum adaptive pooling.

import torch
from nnutils_pytorch import adaptive_avgpool_2d, adaptive_maxpool_2d

# Two random images with three channels, 10 pixels high and 12 pixels wide.
x = torch.rand(2, 3, 10, 12)
# N x 2 matrix containing the height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Pool images to a fixed size, taking into account the original size of each
# image before padding.
#
# Output tensor has shape (2, 3, 3, 5)
y1 = adaptive_avgpool_2d(batch_input=x, output_sizes=(3, 5), batch_sizes=xs)

# Pool a single dimension of the images, taking into account the original
# size of each image before padding. The None dimension is not pooled.
#
# Output tensor has shape (2, 3, 5, 12)
y2 = adaptive_maxpool_2d(x, (5, None), xs)

Important: The implementation assumes that the images are aligned to the top-left corner.
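
For intuition, here is a slow pure-PyTorch sketch of what the average-pooling variant computes (the function name and the Python loop are illustrative only; the library implements this in fused CPU/CUDA kernels):

import torch
import torch.nn.functional as F

def adaptive_avgpool_2d_reference(x, output_sizes, xs):
    """Slow reference sketch: pool each image over its true (unpadded) region.

    x:  (N, C, H, W) padded batch, images aligned to the top-left corner.
    xs: (N, 2) int64 tensor with the true (height, width) of each image.
    """
    n, c = x.size(0), x.size(1)
    oh, ow = output_sizes
    y = x.new_zeros(n, c, oh, ow)
    for i in range(n):
        h, w = int(xs[i, 0]), int(xs[i, 1])
        # Crop away the padding, then apply standard adaptive pooling
        # to the real image content only.
        y[i] = F.adaptive_avg_pool2d(x[i, :, :h, :w], (oh, ow))
    return y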

Masking images by size

If you group images of different sizes into zero-padded batches, you may need to mask the input tensors before, or the output tensors after, some layers. This operation is very handy (and efficient) in those cases.

import torch
from nnutils_pytorch import mask_image_from_size

# Two random images with three channels, 10 pixels high and 12 pixels wide.
x = torch.rand(2, 3, 10, 12)
# N x 2 matrix containing the height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Note: mask_image_from_size is differentiable w.r.t. x
y = mask_image_from_size(x, xs, mask_value=0)  # mask_value is optional.

Important: The implementation assumes that the images are aligned to the top-left corner.
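
Conceptually, the operation is equivalent to this pure-PyTorch sketch (the function name is illustrative; gradients flow to x through the pixels that torch.where keeps):

import torch

def mask_image_from_size_reference(x, xs, mask_value=0):
    """Set every pixel outside each image's true size to mask_value.

    Assumes images are aligned to the top-left corner of the padded batch.
    """
    n, _, h, w = x.shape
    rows = torch.arange(h, device=x.device).view(1, h, 1)
    cols = torch.arange(w, device=x.device).view(1, 1, w)
    # valid[i, r, c] is True iff pixel (r, c) lies inside image i.
    valid = (rows < xs[:, 0].view(n, 1, 1)) & (cols < xs[:, 1].view(n, 1, 1))
    return torch.where(valid.unsqueeze(1), x, torch.full_like(x, mask_value))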

Requirements

  • Python: 3.5, 3.6, 3.7, or 3.8 (all four versions tested).
  • PyTorch >= 1.4.0 (tested with version 1.4.0).
  • C++14 compiler (tested with GCC 7.5.0).
  • For GPU support: CUDA Toolkit.

Installation

Installation should be straightforward, assuming the required libraries and tools are correctly installed.

The setup process compiles the package from source and enables CUDA support when your PyTorch installation was built with CUDA.
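
If you are unsure whether your PyTorch installation was built with CUDA, you can check before building:

import torch

print(torch.version.cuda)         # e.g. "10.2", or None for CPU-only builds
print(torch.cuda.is_available())  # True only if a usable GPU is also present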

From Pypi (recommended)

pip install nnutils-pytorch

Prebuilt wheels for different Python, CUDA, and CPU configurations are available at: http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/

For instance, if you want to install the CPU-only version for Python 3.7:

pip install http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/cpu/nnutils_pytorch-1.4.0-cp37-cp37m-linux_x86_64.whl

From GitHub

git clone https://github.com/jpuigcerver/nnutils.git
cd nnutils/pytorch
python setup.py build
python setup.py install

AVX512-related issues

Compilation problems may arise when building CUDA code with newer host compilers that emit AVX512 instructions. Install GCC 7.5 or later and use it as the host compiler for NVCC 10.2, by setting the CC and CXX environment variables before the build/install commands (assuming the GCC 7 binaries are named gcc-7 and g++-7):

CC=gcc-7 CXX=g++-7 pip install nnutils-pytorch

or (if you are using the GitHub source code):

CC=gcc-7 CXX=g++-7 python setup.py build

Testing

Once installed, you can test the library with unittest by running the following commands:

python -m unittest nnutils_pytorch.adaptive_avgpool_2d_test
python -m unittest nnutils_pytorch.adaptive_maxpool_2d_test
python -m unittest nnutils_pytorch.mask_image_from_size_test

All tests should pass (CUDA tests are executed only if CUDA support is available).
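
Alternatively, the following quick smoke test (shapes taken from the examples above) should run without raising:

import torch
from nnutils_pytorch import adaptive_avgpool_2d, mask_image_from_size

x = torch.rand(2, 3, 10, 12)
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)
assert adaptive_avgpool_2d(x, (3, 5), xs).shape == (2, 3, 3, 5)
assert mask_image_from_size(x, xs).shape == x.shape
print("nnutils-pytorch OK")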

Download files

Download the file for your platform.

Source Distribution

nnutils_pytorch-1.4.0.tar.gz (18.6 kB)


Built Distributions

File details

Details for the file nnutils_pytorch-1.4.0.tar.gz.

File metadata

  • Download URL: nnutils_pytorch-1.4.0.tar.gz
  • Upload date:
  • Size: 18.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.6.9

File hashes

Hashes for nnutils_pytorch-1.4.0.tar.gz
  • SHA256: c6a666b021ce43e4d59640dd2e9ab3251bea2ebae181b95909f9d535e2a92c8d
  • MD5: 210c9e1bfff7bfe39c7dcad4eb068100
  • BLAKE2b-256: 59ac9d37742af78074497b8bd3d27fd29c0e879c4f6e1deb42c386cb5668c2c0
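
To verify a downloaded file against a published digest, you can use Python's hashlib (the expected value below is the SHA256 digest listed above):

import hashlib

expected = "c6a666b021ce43e4d59640dd2e9ab3251bea2ebae181b95909f9d535e2a92c8d"
with open("nnutils_pytorch-1.4.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == expected, "SHA256 mismatch: corrupted or tampered download"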


File details

Details for the file nnutils_pytorch-1.4.0-cp38-cp38-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-1.4.0-cp38-cp38-manylinux1_x86_64.whl
  • Upload date:
  • Size: 4.4 MB
  • Tags: CPython 3.8
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.6.9

File hashes

Hashes for nnutils_pytorch-1.4.0-cp38-cp38-manylinux1_x86_64.whl
  • SHA256: b661a37e44ae7883b4eccacf9bda6d66f4bc310da934f9a52a41270280db076c
  • MD5: 84be3854137a1f0a11692071bb270b55
  • BLAKE2b-256: 17291d0d95e2f6a157f2e5872bf2bd4f49adb0ee7fc0cc028fa688bae514e2d5


File details

Details for the file nnutils_pytorch-1.4.0-cp37-cp37m-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-1.4.0-cp37-cp37m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 4.4 MB
  • Tags: CPython 3.7m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.6.9

File hashes

Hashes for nnutils_pytorch-1.4.0-cp37-cp37m-manylinux1_x86_64.whl
  • SHA256: 39ed1f17c10f4803375ece80425e28969516914683c8c94af47da7b8be85fcb1
  • MD5: e6454f77e8885fa382562c0732e2e5ed
  • BLAKE2b-256: d25e8591d4bee7c2528013e6adbedd0422170851e29bc1c27a2cd46e5b481900


File details

Details for the file nnutils_pytorch-1.4.0-cp36-cp36m-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-1.4.0-cp36-cp36m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 4.4 MB
  • Tags: CPython 3.6m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.6.9

File hashes

Hashes for nnutils_pytorch-1.4.0-cp36-cp36m-manylinux1_x86_64.whl
  • SHA256: 0076a95612b8819df0b409f5e16acf2a763d199ba5e9d09d5a9a7a83b85e662b
  • MD5: a704522887eeefbf626d9247003b816b
  • BLAKE2b-256: b40e23603b0acc56aeb8303073ae5272b59c0e734132154d940e4e110c355c2d


File details

Details for the file nnutils_pytorch-1.4.0-cp35-cp35m-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-1.4.0-cp35-cp35m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 4.4 MB
  • Tags: CPython 3.5m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.6.9

File hashes

Hashes for nnutils_pytorch-1.4.0-cp35-cp35m-manylinux1_x86_64.whl
  • SHA256: eeb58ea68352256d336d3f0f4ddaa3e9e58c80d4b996a6e79a6f445ceae43ef8
  • MD5: 88df22e83960b9206614288a6c14a293
  • BLAKE2b-256: d9e5955300c89486bd960e396828d311a44ad83e528e6dd21e33672c185e5811

