Torchmore
The torchmore library is a small collection of layers and utilities for writing PyTorch models for image recognition, OCR, and other applications.
Flex
The flex library performs simple size inference. It does so by wrapping individual layers in a wrapper that instantiates the layer only once dimensional data is available. The wrappers can be removed later, leaving a model built entirely from standard modules. That looks like this:
from torch import nn
from torchmore import layers, flex
noutput = 10
model = nn.Sequential(
    layers.Input("BDHW"),             # expect batch, depth, height, width
    flex.Conv2d(100),
    flex.BatchNorm(),
    nn.ReLU(),
    flex.Conv2d(100),
    flex.BatchNorm(),
    nn.ReLU(),
    layers.Reshape(0, [1, 2, 3]),     # keep the batch axis, flatten D/H/W
    flex.Full(100),                   # fully connected layer, input size inferred
    flex.BatchNorm(),
    nn.ReLU(),
    flex.Full(noutput)
)
flex.shape_inference(model, (1, 1, 28, 28))
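After shape inference, the model can be used like any other module. A quick sanity check (a minimal sketch; the input shape simply matches the one passed to shape_inference above):

import torch

x = torch.rand(1, 1, 28, 28)      # dummy batch matching the inference shape
y = model(x)
assert y.shape == (1, noutput)    # one vector of `noutput` outputs per sample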
The flex library currently provides wrappers for the following layers:
- Linear
- Conv1d, Conv2d, Conv3d
- ConvTranspose1d, ConvTranspose2d, ConvTranspose3d
- LSTM, BDL_LSTM, BDHW_LSTM
- BatchNorm1d, BatchNorm2d, BatchNorm3d
- BatchNorm
You can also use Flex directly. The following two layers are equivalent:
layer1 = flex.Conv2d(100)
layer2 = flex.Flex(lambda x: nn.Conv2d(x.size(1), 100))
That is, you can easily turn any layer into a Flex layer this way, even if it isn't already in the library.
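For instance, a layer type the library doesn't wrap (here nn.GroupNorm, chosen purely as an illustration) can be made size-inferring the same way:

# hypothetical example: wrap a layer that has no flex.* shortcut
# (the group count of 4 is an arbitrary choice and must divide the channel count)
layer3 = flex.Flex(lambda x: nn.GroupNorm(4, x.size(1)))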
Layers
layers.Input
The Input layer is a handy little layer that reorders input dimensions, checks sizes and value ranges, and automatically transfers data to the device the model is running on.
For example, consider the following Input layer:
layers.Input("BHWD", "BDHW", range=(0, 1), sizes=[None, 1, None, None]),
This says:
- the input is in "BHWD" order and will get reordered to "BDHW"
- input values must be in the interval $[0, 1]$
- input tensors must have $D=1$
- input tensors are transferred to the same device as weights for the model
The .order Attribute
Note that if the input tensor has a .order attribute, that attribute is used to reorder the input dimensions into the desired order. This allows the model to accept inputs in multiple dimension orders. Consider:
import torch

model = nn.Sequential(
    layers.Input("BHWD", "BDHW", range=(0, 1), sizes=[None, 1, None, None]),
    ...
)
a = torch.rand((1, 100, 150, 1))   # BHWD order
b = a.permute(0, 3, 1, 2)          # the same data, now in BDHW order
b.order = "BDHW"
assert (model(a) == model(b)).all()
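The declared checks are enforced at run time. For instance, input values outside the declared range should be rejected (a sketch; the exact exception type raised is up to the library):

bad = torch.rand((1, 100, 150, 1)) * 2.0   # values fall outside [0, 1]
try:
    model(bad)
except Exception as e:                     # exception type depends on torchmore
    print("range check failed:", e)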
layers.Reorder
The Reorder layer reorders axes just like Tensor.permute does, but in a way that documents better what is going on. Consider the following code fragment:
layers.Reorder("BDL", "LBD"),
flex.LSTM(100, bidirectional=True),
layers.Reorder("LBD", "BDL"),
flex.Conv1d(noutput, 1),
layers.Reorder("BDL", "BLD")
The letters themselves are arbitrary, but common choices are drawn from "BDLHW" (batch, depth, length, height, width). This is likely clearer than an equivalent sequence of permute calls.
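As a concrete sanity check, Reorder("BDL", "LBD") computes the same result as x.permute(2, 0, 1) (a minimal sketch; the shapes are arbitrary):

import torch
from torchmore import layers

x = torch.rand(4, 8, 32)                 # B=4, D=8, L=32
y = layers.Reorder("BDL", "LBD")(x)
assert y.shape == (32, 4, 8)             # now in L, B, D order
assert (y == x.permute(2, 0, 1)).all()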
layers.Fun
For module-based networks, it's convenient to be able to add simple functions as layers. The Fun layer permits that, as in:
layers.Fun("lambda x: x.permute(2, 0, 1)")
Note that since the function is specified as a string, the resulting layer can be pickled.
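That property is easy to check directly (a small sketch):

import pickle
from torchmore import layers

f = layers.Fun("lambda x: x.permute(2, 0, 1)")
g = pickle.loads(pickle.dumps(f))   # round-trips because the function is stored as a string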
LSTM layers
- layers.LSTM: a trivial LSTM layer that simply discards the state output
- layers.BDL_LSTM: an LSTM variant that is a drop-in replacement for a Conv1d layer (see the sketch after this list)
- layers.BDHW_LSTM: an MDLSTM variant that is a drop-in replacement for a Conv2d layer
- layers.BDHW_LSTM_to_BDH: a row-wise LSTM, reducing the dimensionality by 1
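A minimal sketch of the BDL_LSTM drop-in idea, assuming the flex wrapper forwards its first argument as the number of hidden units (analogous to the number of output channels for flex.Conv1d):

from torch import nn
from torchmore import layers, flex

noutput = 10
model = nn.Sequential(
    layers.Input("BDL"),
    flex.Conv1d(64, 3, padding=1),
    nn.ReLU(),
    flex.BDL_LSTM(100),      # stands in where a Conv1d would go: BDL in, BDL out
    flex.Conv1d(noutput, 1),
)
flex.shape_inference(model, (1, 1, 128))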
Other Layers
These may be occasionally useful:
- layers.Info(info="", every=1000000): prints information about the activations
- layers.CheckSizes(...): checks the sizes of tensors propagated through the model
- layers.CheckRange(...): checks the ranges of values
- layers.Permute(...): axis permutation (like x.permute)
- layers.Reshape(...): tensor reshaping, with the option of combining axes (see the sketch after this list)
- layers.View(...): the equivalent of x.view
- layers.Parallel: runs two modules in parallel and stacks the results
- layers.SimplePooling2d: wrapped-up max pooling/unpooling
- layers.AcrossPooling2d: wrapped-up max pooling/unpooling with convolution
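A minimal sketch of the axis-combining form of Reshape, assuming (as in the flex example above) that integer arguments name input axes to keep and a list means "combine these input axes into one":

import torch
from torchmore import layers

x = torch.rand(2, 3, 4, 5)            # e.g. BDHW
r = layers.Reshape(0, [1, 2, 3])      # keep axis 0, flatten axes 1-3 together
assert r(x).shape == (2, 60)          # 3 * 4 * 5 == 60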