
PyTorch bindings of the nnutils library

Project description

nnutils-pytorch


PyTorch bindings of different neural network-related utilities implemented for CPUs and GPUs (CUDA).

So far, most of the utilities stem from my need to work with images of different sizes, grouped into batches with padding.

Included functions

Adaptive pooling

Adaptive pooling layers included in several packages like Torch or PyTorch assume that all images in the batch have the same size. This implementation takes into account the size of each individual image within the batch (before padding) to apply the adaptive pooling.

Currently implemented: Average and maximum adaptive pooling.

import torch
from nnutils_pytorch import adaptive_avgpool_2d, adaptive_maxpool_2d

# Two random images with three channels, 10 pixels high and 12 pixels wide.
x = torch.rand(2, 3, 10, 12)
# Matrix (N x 2) containing the true height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Pool images to a fixed size, taking into account the original size of each
# image before padding.
#
# Output tensor has shape (2, 3, 3, 5)
y1 = adaptive_avgpool_2d(batch_input=x, output_sizes=(3, 5), batch_sizes=xs)

# Pool a single dimension of the images, taking into account the original
# size of each image before padding. The None dimension is not pooled.
#
# Output tensor has shape (2, 3, 5, 12)
y2 = adaptive_maxpool_2d(x, (5, None), xs)

Important: The implementation assumes that the images are aligned to the top-left corner.
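For clarity, here is a minimal pure-PyTorch sketch of the intended semantics (an illustration of the behavior described above, not the library's actual implementation; the helper name adaptive_avgpool_2d_reference is hypothetical): each image is cropped to its true size before the standard adaptive pooling is applied.

import torch
import torch.nn.functional as F

def adaptive_avgpool_2d_reference(x, output_size, xs):
    # Crop each image to its true size (top-left aligned) and pool it
    # individually with the standard adaptive average pooling.
    out = []
    for n in range(x.size(0)):
        h, w = int(xs[n, 0]), int(xs[n, 1])
        img = x[n:n + 1, :, :h, :w]
        out.append(F.adaptive_avg_pool2d(img, output_size))
    return torch.cat(out, dim=0)

x = torch.rand(2, 3, 10, 12)
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)
y_ref = adaptive_avgpool_2d_reference(x, (3, 5), xs)  # shape (2, 3, 3, 5)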

Masking images by size

If you group images of different sizes into batches padded with zeros, you may need to mask the input or output tensors before or after some layers. This operation is very handy (and efficient) in those cases.

import torch
from nnutils_pytorch import mask_image_from_size

# Two random images with three channels, 10 pixels high and 12 pixels wide.
x = torch.rand(2, 3, 10, 12)
# Matrix (N x 2) containing the true height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Note: mask_image_from_size is differentiable w.r.t. x
y = mask_image_from_size(x, xs, mask_value=0)  # mask_value is optional.

Important: The implementation assumes that the images are aligned to the top-left corner.
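Conceptually, the masking is equivalent to the following pure-PyTorch sketch (an illustration of the assumed semantics; mask_image_from_size_reference is a hypothetical name, and the real function is the efficient, differentiable version):

import torch

def mask_image_from_size_reference(x, xs, mask_value=0):
    # Set every pixel outside each image's true (height, width) region,
    # i.e. the padding, to mask_value. Images are top-left aligned.
    y = x.clone()
    for n in range(x.size(0)):
        h, w = int(xs[n, 0]), int(xs[n, 1])
        y[n, :, h:, :] = mask_value  # rows below the true height
        y[n, :, :, w:] = mask_value  # columns right of the true width
    return y

Since the masked positions are constants, their gradients with respect to x are zero there, which matches the differentiability note above.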

Requirements

  • Python: 2.7, 3.5, 3.6 or 3.7 (all four versions tested).
  • PyTorch (tested with version 1.0.0).
  • C++11 compiler (tested with GCC 4.8.2, 5.5.0, 6.4.0).
  • For GPU support: CUDA Toolkit.

Installation

The installation process should be pretty straightforward assuming that you have correctly installed the required libraries and tools.

The setup process compiles the package from source, and will include CUDA support if your PyTorch installation has CUDA available.
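As a quick post-install sanity check, a sketch like the following can be run (it assumes the functions accept CUDA tensors directly when a CUDA-enabled build is installed, as the CPU/GPU support described above suggests):

import torch
from nnutils_pytorch import mask_image_from_size

x = torch.rand(2, 3, 10, 12)
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)
y_cpu = mask_image_from_size(x, xs)  # CPU path
if torch.cuda.is_available():
    # GPU path; assumes the package was built with CUDA support.
    y_gpu = mask_image_from_size(x.cuda(), xs.cuda())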

From PyPI (recommended)

pip install nnutils-pytorch

Pre-compiled wheels for different Python, CUDA and CPU configurations are available at: http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/

For instance, if you want to install the CPU-only version for Python 3.7:

pip install http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/cpu/nnutils_pytorch-0.3.0-cp37-cp37m-linux_x86_64.whl

From GitHub

git clone https://github.com/jpuigcerver/nnutils.git
cd nnutils/pytorch
python setup.py build
python setup.py install

AVX512 related issues

Compilation problems may arise when using CUDA together with newer host compilers that emit AVX512 instructions. Please install GCC 4.9 and use it as the host compiler for NVCC. You can simply set the CC and CXX environment variables before the build/install commands:

CC=gcc-4.9 CXX=g++-4.9 pip install nnutils-pytorch

or (if you are using the GitHub source code):

CC=gcc-4.9 CXX=g++-4.9 python setup.py build

Testing

Once installed, you can test the library using unittest. In particular, run the following commands:

python -m unittest nnutils_pytorch.adaptive_avgpool_2d_test
python -m unittest nnutils_pytorch.adaptive_maxpool_2d_test
python -m unittest nnutils_pytorch.mask_image_from_size_test

All tests should pass (CUDA tests are only executed if supported).

Download files


Source Distribution

nnutils_pytorch-0.3.0.tar.gz (18.9 kB)

Built Distributions

nnutils_pytorch-0.3.0-cp37-cp37m-manylinux1_x86_64.whl (2.7 MB)
nnutils_pytorch-0.3.0-cp36-cp36m-manylinux1_x86_64.whl (2.7 MB)
nnutils_pytorch-0.3.0-cp35-cp35m-manylinux1_x86_64.whl (2.7 MB)
nnutils_pytorch-0.3.0-cp27-cp27mu-manylinux1_x86_64.whl (2.7 MB)

File details

Details for the file nnutils_pytorch-0.3.0.tar.gz.

File metadata

  • Download URL: nnutils_pytorch-0.3.0.tar.gz
  • Upload date:
  • Size: 18.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.9.1 setuptools/20.7.0 requests-toolbelt/0.8.0 tqdm/4.19.5 CPython/3.5.2

File hashes

Hashes for nnutils_pytorch-0.3.0.tar.gz
  • SHA256: dfb271189ed704b0900501e90724c4b991d07ea959781f95f2a0666ca5380e1f
  • MD5: d7ef89042f172d34929c12062aa09a57
  • BLAKE2b-256: af7646566834414298b051c1a7a438b78bc67db2225cabafa44c114102c450ef

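If you want to check a downloaded file against the digests listed above, a standard-library sketch like the following works (sha256_of is a hypothetical helper; the expected digest is the SHA256 value from the list above):

import hashlib

def sha256_of(path):
    # Hash the file in 1 MiB chunks to keep memory use constant.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "dfb271189ed704b0900501e90724c4b991d07ea959781f95f2a0666ca5380e1f"
assert sha256_of("nnutils_pytorch-0.3.0.tar.gz") == expected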

File details

Details for the file nnutils_pytorch-0.3.0-cp37-cp37m-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-0.3.0-cp37-cp37m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 2.7 MB
  • Tags: CPython 3.7m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.9.1 setuptools/20.7.0 requests-toolbelt/0.8.0 tqdm/4.19.5 CPython/3.5.2

File hashes

Hashes for nnutils_pytorch-0.3.0-cp37-cp37m-manylinux1_x86_64.whl
  • SHA256: 45114195d69b32a8cdeaa7e06d5b25188a948fee38b39f3979678867a5e173fc
  • MD5: 8b0f932ac17da7bbd7c96a33f126b731
  • BLAKE2b-256: 025306580735e06d3c4f13bbf5fb5ba31c3ebb400acfb84296e871ffd4e61284


File details

Details for the file nnutils_pytorch-0.3.0-cp36-cp36m-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-0.3.0-cp36-cp36m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 2.7 MB
  • Tags: CPython 3.6m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.9.1 setuptools/20.7.0 requests-toolbelt/0.8.0 tqdm/4.19.5 CPython/3.5.2

File hashes

Hashes for nnutils_pytorch-0.3.0-cp36-cp36m-manylinux1_x86_64.whl
  • SHA256: d3eedd465afd3ac73add5d31f6b98c0c5eaae69f0ba8a9a6705a30756c7a5759
  • MD5: d569aca02bd1d0e47917517a3157dacb
  • BLAKE2b-256: 9fb1ca5ff3db2f8b2e479cf3f9eb2b374a8dfe513d4a7199a640e6013170e060


File details

Details for the file nnutils_pytorch-0.3.0-cp35-cp35m-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-0.3.0-cp35-cp35m-manylinux1_x86_64.whl
  • Upload date:
  • Size: 2.7 MB
  • Tags: CPython 3.5m
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.9.1 setuptools/20.7.0 requests-toolbelt/0.8.0 tqdm/4.19.5 CPython/3.5.2

File hashes

Hashes for nnutils_pytorch-0.3.0-cp35-cp35m-manylinux1_x86_64.whl
  • SHA256: b2cf8a9ea3824df33906264dec4cb2d58d642fbf23046aabe6e0d0260ffa7ff0
  • MD5: 49661cae594db8d1ef69bbb98221ebce
  • BLAKE2b-256: 88eaabb744407283f524b103456cfb4bb9b589e9ad8f7a9d6fb8df1662eec462


File details

Details for the file nnutils_pytorch-0.3.0-cp27-cp27mu-manylinux1_x86_64.whl.

File metadata

  • Download URL: nnutils_pytorch-0.3.0-cp27-cp27mu-manylinux1_x86_64.whl
  • Upload date:
  • Size: 2.7 MB
  • Tags: CPython 2.7mu
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.9.1 setuptools/20.7.0 requests-toolbelt/0.8.0 tqdm/4.19.5 CPython/3.5.2

File hashes

Hashes for nnutils_pytorch-0.3.0-cp27-cp27mu-manylinux1_x86_64.whl
  • SHA256: ab3b5550d3f7fe7642cb14d06bf78134ea642e3bd46abe83fa603a131c9eb2a8
  • MD5: cc2f9b4770e02202d25f645c12913e07
  • BLAKE2b-256: 081c86f4ea7ab114eb51684dce37b8a00e3a8b45a23a0912da11814c1b5e2216

