

nnutils-pytorch


PyTorch bindings of different neural network-related utilities implemented for CPUs and GPUs (CUDA).

So far, most of the utilities are related to my need to work with images of different sizes, grouped into batches with padding.

Included functions

Adaptive pooling

Adaptive pooling layers included in several packages like Torch or PyTorch assume that all images in the batch have the same size. This implementation takes into account the size of each individual image within the batch (before padding) to apply the adaptive pooling.

Currently implemented: Average and maximum adaptive pooling.

import torch
from nnutils_pytorch import adaptive_avgpool_2d, adaptive_maxpool_2d

# Two random images with three channels, 10 pixels high and 12 pixels wide.
x = torch.rand(2, 3, 10, 12)
# Matrix (N x 2) containing the height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Pool images to a fixed size, taking into account the original size of each
# image before padding.
#
# Output tensor has shape (2, 3, 3, 5)
y1 = adaptive_avgpool_2d(batch_input=x, output_sizes=(3, 5), batch_sizes=xs)

# Pool a single dimension of the images, taking into account the original
# size of each image before padding. The None dimension is not pooled.
#
# Output tensor has shape (2, 3, 5, 12)
y2 = adaptive_maxpool_2d(x, (5, None), xs)
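To build intuition for what batch_sizes does, here is a hedged sketch: conceptually, pooling an image with its true size taken into account should behave like cropping away the padding first and pooling the crop. The comparison against PyTorch's built-in adaptive_avg_pool2d assumes the two implementations split the pooling bins the same way.

import torch
import torch.nn.functional as F
from nnutils_pytorch import adaptive_avgpool_2d

x = torch.rand(2, 3, 10, 12)
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Pool with per-image sizes, then compare the first image against pooling
# its unpadded (10 x 6) region directly with PyTorch's built-in operator.
# Assumption: both implementations use the same bin boundaries.
y = adaptive_avgpool_2d(x, (3, 5), xs)
y0 = F.adaptive_avg_pool2d(x[0:1, :, :10, :6], (3, 5))
print(torch.allclose(y[0:1], y0))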

Important: The implementation assumes that the images are aligned to the top-left corner.
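For reference, a minimal sketch of how such a top-left-aligned, zero-padded batch might be assembled by hand (this helper code is illustrative, not part of the library):

import torch

# Two images of different sizes, each written into the top-left corner
# of a batch padded with zeros to the maximum height and width.
a = torch.rand(3, 10, 6)   # image 1: 10 x 6
b = torch.rand(3, 6, 12)   # image 2: 6 x 12
x = torch.zeros(2, 3, 10, 12)
x[0, :, :10, :6] = a
x[1, :, :6, :12] = b
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)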

Masking images by size

If you are grouping images of different sizes into zero-padded batches, you may need to mask the input tensors before some layers (or the output tensors after them). This operation is very handy (and efficient) in these cases.

import torch
from nnutils_pytorch import mask_image_from_size

# Two random images with three channels, 10 pixels high and 12 pixels wide.
x = torch.rand(2, 3, 10, 12)
# Matrix (N x 2) containing the height and width of each image.
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Note: mask_image_from_size is differentiable w.r.t. x
y = mask_image_from_size(x, xs, mask_value=0)  # mask_value is optional.

Important: The implementation assumes that the images are aligned to the top-left corner.
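As a sketch of how masking fits into a padded pipeline (the convolution here is illustrative, not part of the library): after a size-preserving layer, the padded region generally contains non-zero values, so it is re-masked before the next layer. Since mask_image_from_size is differentiable with respect to x, gradients flow through as usual.

import torch
from nnutils_pytorch import mask_image_from_size

conv = torch.nn.Conv2d(3, 8, kernel_size=3, padding=1)  # preserves H x W

x = torch.rand(2, 3, 10, 12, requires_grad=True)
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

h = conv(x)                      # padded region is no longer zero
h = mask_image_from_size(h, xs)  # zero it out again before the next layer
h.sum().backward()               # gradients flow back through the mask to x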

Requirements

  • Python: 2.7, 3.5, 3.6 or 3.7 (all four versions tested).
  • PyTorch (tested with version 0.4.1).
  • C++11 compiler (tested with GCC 4.8.2, 5.5.0, 6.4.0).
  • For GPU support: CUDA Toolkit.

Installation

The installation process should be pretty straightforward assuming that you have correctly installed the required libraries and tools.

The setup process compiles the package from source, and will compile with CUDA support if CUDA is available to your PyTorch installation.
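One quick way to check whether CUDA is available to your PyTorch installation (and hence whether the extension will be built with CUDA support):

python -c "import torch; print(torch.cuda.is_available())"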

From PyPI (recommended)

pip install nnutils-pytorch

Pre-compiled packages for different Python, CUDA and CPU configurations are available at: http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/

For instance, if you want to install the CPU-only version for Python 3.7:

pip install http://www.jpuigcerver.net/projects/nnutils-pytorch/whl/cpu/nnutils_pytorch-0.2.1-cp37-cp37m-linux_x86_64.whl

From GitHub

git clone https://github.com/jpuigcerver/nnutils.git
cd nnutils/pytorch
python setup.py build
python setup.py install

AVX512 related issues

Some compilation problems may arise when using CUDA and newer host compilers with AVX512 instructions. Please install GCC 4.9 and use it as the host compiler for NVCC. You can simply set the CC and CXX environment variables before running the build/install commands:

CC=gcc-4.9 CXX=g++-4.9 pip install nnutils-pytorch

or (if you are using the GitHub source code):

CC=gcc-4.9 CXX=g++-4.9 python setup.py build

Testing

Once installed, you can test the library using unittest. In particular, run the following commands:

python -m unittest nnutils_pytorch.adaptive_avgpool_2d_test
python -m unittest nnutils_pytorch.adaptive_maxpool_2d_test
python -m unittest nnutils_pytorch.mask_image_from_size_test

All tests should pass (CUDA tests are only executed if supported).
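Alternatively, here is a minimal smoke test using only the documented API (shapes taken from the examples above):

import torch
from nnutils_pytorch import adaptive_avgpool_2d, mask_image_from_size

x = torch.rand(2, 3, 10, 12)
xs = torch.tensor([[10, 6], [6, 12]], dtype=torch.int64)

# Pooled output should have the requested fixed size.
y = adaptive_avgpool_2d(x, (3, 5), xs)
assert y.shape == (2, 3, 3, 5)

# The padded region of the first image (columns 6-11) should be zeroed.
m = mask_image_from_size(x, xs)
assert torch.equal(m[0, :, :, 6:], torch.zeros(3, 10, 6))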

