Global-Wise Pooling Layers for PyTorch

Project description

Non-Local Pooling

Update 2.0.1

  • NonLocalPool2d no longer uses PixelShuffle by default, since it may harm performance. NonLocalPool3d still uses PixelShuffle3d to reduce the token count.
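For intuition, a PixelShuffle-style rearrangement used as a downsampler (space-to-depth) trades spatial resolution for channels, which is what shrinks the token count. The shape rule below is a sketch of that bookkeeping under this assumption, not the package's code:

```python
def shape_after_space_to_depth(c, h, w, r):
    """Space-to-depth rearrangement with block size r (the inverse of
    PixelShuffle): spatial dims shrink by r, channels grow by r*r, so
    the number of spatial tokens (h * w) drops by a factor of r*r."""
    assert h % r == 0 and w % r == 0, "spatial dims must divide by r"
    return (c * r * r, h // r, w // r)

# A 16x8x8 feature map has 64 spatial tokens; after a block-size-2
# rearrangement it has 16 tokens with four times the channels.
print(shape_after_space_to_depth(16, 8, 8, 2))  # (64, 4, 4)
```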

Update 1.2.0

  • You can now set the output channel count, just like other learnable pooling methods (earlier versions forced out_channel=in_channel). However, if you specify out_channel, the MaxPool branch will not work, since its output can no longer be added to the non-local branch. Leave out_channel as None to make the module work as before.
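The constraint above boils down to a shape check: MaxPool preserves the channel count, so its output can only be summed with the non-local branch when the two channel counts agree. A hypothetical helper (not part of the package) that mirrors the rule:

```python
def maxpool_branch_enabled(in_channel, out_channel=None):
    """Mirrors the release note: out_channel=None means the module
    behaves as before (output keeps in_channel channels), so the
    MaxPool branch can be added element-wise; an explicit out_channel
    different from in_channel makes the shapes incompatible."""
    effective = out_channel if out_channel is not None else in_channel
    return effective == in_channel

print(maxpool_branch_enabled(64))       # True  (default, works as before)
print(maxpool_branch_enabled(64, 128))  # False (MaxPool branch disabled)
```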

To use NonLocalPooling for your PyTorch project:

Step 1

pip install nonlocalpooling

Step 2

from nonlocalpooling.pool import NonLocalPool2d, NonLocalPool3d

Non-Local Pooling can be used as a drop-in substitute for your original PyTorch pooling methods.
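As a rough picture of what "non-local" means here, every pooled output attends over all input positions, rather than only a local window as in MaxPool or AvgPool. The following is a conceptual sketch in plain Python, not the package's implementation:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def non_local_pool(tokens, queries):
    """Each query attends over ALL tokens (global, i.e. non-local),
    producing one pooled vector per query; a small set of queries
    therefore downsamples the whole feature map at once."""
    dim = len(tokens[0])
    pooled = []
    for q in queries:
        scores = [sum(qi * ti for qi, ti in zip(q, t)) for t in tokens]
        weights = softmax(scores)
        pooled.append([sum(w * t[c] for w, t in zip(weights, tokens))
                       for c in range(dim)])
    return pooled

# Pool a flattened 2x2 feature map (4 tokens, 2 channels) down to a
# single vector with one query; the result is an attention-weighted
# average of all four positions.
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.0, 0.0]]
print(non_local_pool(tokens, [[1.0, 1.0]]))
```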

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nonlocalpooling-2.0.1.tar.gz (4.1 kB)

Uploaded Source

Built Distribution

nonlocalpooling-2.0.1-py3-none-any.whl (4.1 kB)

Uploaded Python 3
