ND Convolution for PyTorch

torchnd

N-dimensional convolution for PyTorch that works with any number of spatial dimensions, supporting groups, dilation, transposed convolution, complex numbers, and arbitrary dimension layouts.

Installation

pip install torchnd

Usage

2D convolution:

import torch
from torchnd import conv_nd

x = torch.randn(2, 4, 16, 16)
weight = torch.randn(8, 4, 3, 3)
out = conv_nd(x, weight, dim=(-2, -1), padding=1, stride=2)

4D convolution (depth × time × height × width):

x = torch.randn(2, 4, 6, 10, 16, 16)
weight = torch.randn(8, 4, 3, 3, 3, 3)
out = conv_nd(x, weight, dim=(-4, -3, -2, -1), padding=1)

Channel-last layout:

x = torch.randn(2, 16, 16, 4)
weight = torch.randn(8, 4, 3, 3)
out = conv_nd(x, weight, dim=(1, 2), channel_dim=-1, padding=1)

Transposed convolution:

x = torch.randn(2, 4, 8, 8)
weight = torch.randn(4, 8, 3, 3)
out = conv_nd(x, weight, dim=(-2, -1), padding=1, stride=2, transposed=True)

Complex numbers:

x = torch.randn(2, 4, 16, 16, dtype=torch.complex64)
weight = torch.randn(8, 4, 3, 3, dtype=torch.complex64)
out = conv_nd(x, weight, dim=(-2, -1), padding=1)

Asymmetric parameters per dimension:

x = torch.randn(2, 4, 16, 16)
weight = torch.randn(8, 4, 3, 3)
out = conv_nd(
    x, weight,
    dim=(-2, -1),
    stride=(2, 1),
    padding=(1, 2),
    dilation=(1, 2)
)

Modules:

from torchnd import ConvNd, ConvTransposeNd

x = torch.randn(2, 4, 16, 16)
conv = ConvNd(4, 8, 3, dim=(-2, -1), padding=1)
out = conv(x)

Padding:

from torchnd import pad_nd, adjoint_pad_nd

# Pad or crop arbitrary dimensions
padded = pad_nd(x, pad=(1, 1, 2, 2), dims=(-2, -1), mode="reflect")
cropped = pad_nd(x, pad=(-1, -1), dims=(0,))  # Negative values crop

# Adjoint of padding (unpads)
unpadded = adjoint_pad_nd(padded, pad=(1, 1, 2, 2), dims=(-2, -1))

Adjoint operators:

from torchnd import ConvNd

conv = ConvNd(4, 8, 3, dim=(-2, -1), padding=1, bias=False)
x = torch.randn(2, 4, 16, 16)
y = torch.randn_like(conv(x))

# Adjoint satisfies: <conv(x), y> = <x, conv.adjoint(y)>
# For ConvNd: adjoint is transposed convolution with conjugated weights
adj_y = conv.adjoint(y, input_shape=(16, 16))

Functions

pad_nd

N-dimensional padding and cropping with flexible dimension specification. Supports arbitrary dimension layouts, not limited to the last N dimensions. Negative padding values crop the tensor.

Modes: constant, reflect, replicate, circular. Each dimension can use a different mode. For constant mode, negative padding crops symmetrically. For non-constant modes, negative padding is not supported.

The dims parameter specifies which dimensions to pad. If None, pads the last N dimensions where N = len(pad) // 2. Padding is specified as pairs (left, right) per dimension.
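A minimal sketch of the (left, right) pair convention, assuming pad_nd follows the same ordering as torch.nn.functional.pad when dims is None (pairs are listed starting from the last dimension):

```python
import torch
import torch.nn.functional as F

# Assumed illustration of the pair convention via F.pad:
# pad=(1, 2, 3, 4) pads the last dim by (left=1, right=2)
# and the second-to-last dim by (left=3, right=4).
x = torch.zeros(5, 5)
padded = F.pad(x, (1, 2, 3, 4), mode="constant")
print(tuple(padded.shape))  # rows: 5 + 3 + 4, cols: 5 + 1 + 2
```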

adjoint_pad_nd

Computes the adjoint of pad_nd via autograd. For a padding operator P, the adjoint P* satisfies the inner product identity: <Px, y> = <x, P*y> for all x, y. The adjoint unpads the tensor, summing contributions from padded regions back into the original shape. For constant/zeros mode, the adjoint is equivalent to cropping. For other modes, it correctly handles the adjoint of the boundary extension.
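The autograd construction can be sketched with plain PyTorch ops (using F.pad rather than pad_nd itself): the vector-Jacobian product of the padding map is its adjoint, and for zero padding it reduces to a crop.

```python
import torch
import torch.nn.functional as F

# Adjoint of zero padding via autograd: the VJP of P at y is P*y.
x = torch.randn(3, 3, requires_grad=True)
Px = F.pad(x, (1, 1, 2, 2), mode="constant")  # shape (3+2+2, 3+1+1) = (7, 5)

y = torch.randn_like(Px)
(adj_y,) = torch.autograd.grad(Px, x, grad_outputs=y)

# Inner product identity <Px, y> == <x, P*y>:
assert torch.allclose((Px * y).sum(), (x * adj_y).sum())
# For zero padding the adjoint is a crop back to the original shape:
assert torch.equal(adj_y, y[2:-2, 1:-1])
```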

Adjoint convolution

The ConvNd and ConvTransposeNd modules provide adjoint() methods. For ConvNd, the adjoint is the transposed convolution with conjugated weights (for complex) or identical weights (for real). For ConvTransposeNd, the adjoint is the forward convolution with conjugated weights.

The adjoint satisfies the inner product identity: <Ax, y> = <x, A*y> where A is the convolution operator and A* is its adjoint. This is verified via dot product tests. The adjoint is undefined when bias is present.
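The dot product test can be sketched with native PyTorch ops instead of torchnd: for stride 1, real weights, and no bias, conv_transpose2d with the same weight and padding is the adjoint of conv2d.

```python
import torch
import torch.nn.functional as F

# Dot product test of <Ax, y> == <x, A*y> for a real 2D convolution.
torch.manual_seed(0)
w = torch.randn(8, 4, 3, 3)
x = torch.randn(2, 4, 16, 16)
y = torch.randn(2, 8, 16, 16)  # padding=1, stride=1 preserves spatial size

Ax = F.conv2d(x, w, padding=1)
adj_y = F.conv_transpose2d(y, w, padding=1)  # adjoint for real weights

lhs = (Ax * y).sum()
rhs = (x * adj_y).sum()
assert torch.allclose(lhs, rhs, rtol=1e-4, atol=1e-3)
```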

Implementation

For dimensions beyond 3D, conv_nd recursively decomposes the convolution into lower-dimensional operations. A 4D convolution becomes a sum of 3D convolutions applied to strided slices of the input.
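The decomposition can be sketched as follows, simplified to stride 1 and no padding (conv4d_naive is an illustrative name, not the library's internal function):

```python
import torch
import torch.nn.functional as F

def conv4d_naive(x, w):
    """4D convolution (stride 1, no padding) as a sum of 3D convolutions.

    x: (N, Cin, D1, D2, D3, D4), w: (Cout, Cin, k1, k2, k3, k4).
    For each output index i along the first spatial dim, sum 3D
    convolutions of the kernel slices with shifted input slices.
    """
    k1 = w.shape[2]
    out_d1 = x.shape[2] - k1 + 1
    outs = []
    for i in range(out_d1):
        acc = sum(F.conv3d(x[:, :, i + j], w[:, :, j]) for j in range(k1))
        outs.append(acc)
    return torch.stack(outs, dim=2)

# All-ones sanity check: each output entry sums Cin * k1*k2*k3*k4 ones.
x = torch.ones(1, 2, 3, 4, 4, 4)
w = torch.ones(3, 2, 2, 2, 2, 2)
out = conv4d_naive(x, w)
assert out.shape == (1, 3, 2, 3, 3, 3)
assert torch.allclose(out, torch.full_like(out, 32.0))  # 2 * 2**4
```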

Complex convolution is handled by decomposing into real operations: (a+bi)*(c+di) = (ac-bd) + (ad+bc)i.
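A minimal sketch of this decomposition for the 2D case, using four real convolutions (complex_conv2d is an illustrative name, not the library's API):

```python
import torch
import torch.nn.functional as F

def complex_conv2d(x, w):
    # Direct decomposition (a+bi)*(c+di) = (ac - bd) + (ad + bc)i,
    # applied with four real-valued convolutions.
    a, b = x.real, x.imag
    c, d = w.real, w.imag
    real = F.conv2d(a, c) - F.conv2d(b, d)
    imag = F.conv2d(a, d) + F.conv2d(b, c)
    return torch.complex(real, imag)

# Sanity check with a 1x1 kernel, where the conv is a pointwise product:
x = torch.randn(1, 1, 4, 4, dtype=torch.complex64)
w = torch.full((1, 1, 1, 1), 2 + 3j, dtype=torch.complex64)
assert torch.allclose(complex_conv2d(x, w), x * (2 + 3j))
```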

The implementation uses native PyTorch operations when possible (1D, 2D, 3D) and falls back to recursion only for higher dimensions, minimizing memory overhead with strided views.

Download files

Download the file for your platform.

Source Distribution

torchnd-0.1.0.tar.gz (20.0 kB)

Uploaded Source

Built Distribution

torchnd-0.1.0-py3-none-any.whl (11.4 kB)

Uploaded Python 3

File details

Details for the file torchnd-0.1.0.tar.gz.

File metadata

  • Download URL: torchnd-0.1.0.tar.gz
  • Size: 20.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for torchnd-0.1.0.tar.gz:

  • SHA256: 03c3f87644c5b263c14a2552b096b0e4aba5c7276359c2132cc6e36a2cb185d9
  • MD5: 267360e33edf8ca3d6d62897e7723d97
  • BLAKE2b-256: 13e1b115f0dc9c2f2b6e808623487bc5e9fa38a9c4408d937168546f8d63e697

Provenance

The following attestation bundles were made for torchnd-0.1.0.tar.gz:

Publisher: release.yml on fzimmermann89/torchnd

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file torchnd-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: torchnd-0.1.0-py3-none-any.whl
  • Size: 11.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for torchnd-0.1.0-py3-none-any.whl:

  • SHA256: 7352b203c86f819b44c19250553983200b5317004341d88122a1c3881a997c73
  • MD5: d69bce335ee441c494a190e0f4c5d53e
  • BLAKE2b-256: b22e3d22596d59344572137d04dd3f12dbbfe2a0098d3dbe5ac92f8ef111eca8

Provenance

The following attestation bundles were made for torchnd-0.1.0-py3-none-any.whl:

Publisher: release.yml on fzimmermann89/torchnd

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
