Global-Wise Pooling Layers for PyTorch
Non-Local Pooling
Update 2.0.1
- NonLocalPool2d no longer uses PixelShuffle by default, since it may hurt performance. NonLocalPool3d still uses PixelShuffle3d to reduce the number of tokens.
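The token-reduction idea behind PixelShuffle3d can be sketched in one dimension: a space-to-depth rearrangement folds groups of neighbouring positions into the channel dimension, so the attention operates over fewer tokens without discarding values. This toy function is an illustration of that idea, not the package's implementation:

```python
def pixel_unshuffle_1d(feats, r=2):
    """Toy 1-D space-to-depth: fold groups of r neighbouring positions
    into the channel dimension. Token count drops by a factor of r,
    but unlike subsampling, no values are thrown away."""
    assert len(feats) % r == 0, "sequence length must be divisible by r"
    return [sum((feats[i + j] for j in range(r)), [])
            for i in range(0, len(feats), r)]

feats = [[1], [2], [3], [4]]          # 4 tokens, 1 channel each
tokens = pixel_unshuffle_1d(feats)    # 2 tokens, 2 channels each
assert tokens == [[1, 2], [3, 4]]
```

Fewer tokens means the quadratic cost of the non-local attention shrinks accordingly, which is why the 3D variant keeps this step.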
Update 1.2.0
- You can now set the output channel count, just like in other learnable pooling methods (earlier versions forced out_channel = in_channel). However, if you specify out_channel, the MaxPool branch is disabled, since the two outputs can no longer be added together. Leave out_channel as None to keep the previous behavior.
To use NonLocalPooling for your PyTorch project:
Step 1
pip install nonlocalpooling
Step 2
from nonlocalpooling.pool import NonLocalPool2d, NonLocalPool3d
Non-Local Pooling can be used as a drop-in substitute for your original PyTorch pooling layers.
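The difference from max or average pooling is that each output position attends to all input positions rather than a local window. A minimal, dependency-free sketch of that attention-style aggregation on a 1-D sequence (purely illustrative; the package operates on 4-D/5-D tensors with learned projections):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def nonlocal_pool_1d(feats, stride=2):
    """Toy non-local pooling over a 1-D sequence of feature vectors.

    Each strided 'query' position aggregates ALL input positions,
    weighted by a softmax over dot-product similarity -- unlike
    MaxPool/AvgPool, which only see a local window.
    """
    out = []
    for q in feats[::stride]:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) for k in feats]
        weights = softmax(scores)
        pooled = [sum(w * v[d] for w, v in zip(weights, feats))
                  for d in range(len(q))]
        out.append(pooled)
    return out

feats = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]
pooled = nonlocal_pool_1d(feats, stride=2)
assert len(pooled) == 2     # sequence length halved, like stride-2 pooling
assert len(pooled[0]) == 2  # feature dimension preserved
```

Each output is a convex combination of the inputs, so values stay within the input range while global context is mixed in.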