
spdlayers

Symmetric Positive Definite (SPD) enforcement layers for PyTorch.

Regardless of the input, the output of these layers will always be an SPD tensor!

Installation

Install with pip

python -m pip install spdlayers

About

The Cholesky layer uses a Cholesky factorization to enforce SPD, while the Eigen layer uses an eigendecomposition.

Both layers take in a tensor of shape [batch_size, input_shape] and output an SPD tensor of shape [batch_size, output_shape, output_shape]. The relationship between the input and output shapes is defined by the following.

input_shape = sum([i for i in range(output_shape + 1)])

The layers have no learnable parameters, and merely serve to transform a vector space to an SPD matrix space.
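For example, a 6x6 output requires 21 inputs (the number of entries in the lower triangle of a 6x6 matrix). A quick sanity check, as a minimal sketch using the in_shape_from helper shown in the examples below:

import spdlayers

out_shape = 6
# lower-triangle entry count of a 6x6 matrix: 6 * 7 / 2 = 21
print(spdlayers.in_shape_from(out_shape))    # 21
print(sum(i for i in range(out_shape + 1)))  # 21, the relationship above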

The initialization options for each layer are:

Args:
    output_shape (int): The dimension of square tensor to produce,
        default output_shape=6 results in a 6x6 tensor
    symmetry (str): 'anisotropic' or 'orthotropic'. Anisotropic can be
        used to predict for any shape tensor, while 'orthotropic' is a
        special case of symmetry for a 6x6 tensor.
    positive (str): The function to perform the positive
        transformation of the diagonal of the lower triangle tensor.
        Choices are 'Abs' (default), 'Square', 'Softplus', 'ReLU',
        'ReLU6', '4', and 'Exp'.
    min_value (float): The minimum allowable value for a diagonal
        component. Default is 1e-8.
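For instance, these options can be combined when constructing a layer. A minimal sketch (the argument values below are arbitrary illustrations, not recommendations):

import spdlayers

# 3x3 Cholesky-based SPD layer with a Softplus transform on the diagonal
# and a larger floor on the diagonal values
chol = spdlayers.Cholesky(output_shape=3, positive='Softplus', min_value=1e-6)

# anisotropic Eigen layer producing a 6x6 SPD tensor
eig = spdlayers.Eigen(output_shape=6, symmetry='anisotropic')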

Examples

This is the simplest neural network, using one hidden layer of size 100. There are 2 input features to the model (n_features = 2), and the model outputs a 6 x 6 SPD tensor.

Using the Cholesky factorization as the SPD layer:

import torch.nn as nn
import spdlayers

hidden_size = 100
n_features = 2
out_shape = 6
in_shape = spdlayers.in_shape_from(out_shape)

model = nn.Sequential(
          nn.Linear(n_features, hidden_size),
          nn.Linear(hidden_size, in_shape),
          spdlayers.Cholesky(output_shape=out_shape)
        )

Or with the eigendecomposition as the SPD layer:

import torch.nn as nn
import spdlayers

hidden_size = 100
n_features = 2
out_shape = 6
in_shape = spdlayers.in_shape_from(out_shape)

model = nn.Sequential(
          nn.Linear(n_features, hidden_size),
          nn.Linear(hidden_size, in_shape),
          spdlayers.Eigen(output_shape=out_shape)
        )
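Either model maps a batch of feature vectors to a batch of SPD matrices. A quick check of the output (a sketch, assuming the model defined above):

import torch

x = torch.rand(8, n_features)                  # batch of 8 samples, 2 features each
C = model(x)                                   # SPD output, shape [8, 6, 6]

print(C.shape)                                 # torch.Size([8, 6, 6])
print(torch.allclose(C, C.transpose(-1, -2)))  # True, symmetric
# eigenvalues should be strictly positive; very small values are possible
# since the diagonal is only floored at min_value
print(bool((torch.linalg.eigvalsh(C) > 0).all()))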

examples/train_sequential_model.py trains this model on the orthotropic stiffness tensor from the 2D Isotruss.

API

The API has the following import structure.

spdlayers
    ├── Cholesky
    ├── Eigen
    ├── in_shape_from
    ├── layers
    │   ├── Cholesky
    │   └── Eigen
    └── tools
        └── in_shape_from
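Both layers and the in_shape_from helper can therefore be imported from the top level or from their submodules (a sketch, assuming the submodules mirror the tree above):

# top-level imports
from spdlayers import Cholesky, Eigen, in_shape_from

# equivalent submodule imports
from spdlayers.layers import Cholesky, Eigen
from spdlayers.tools import in_shape_from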

Documentation

You can use pdoc3 to build the API documentation, or view the online documentation.

pdoc3 --html spdlayers

Requirements

For basic usage:

python>=3.6
torch>=1.9.0

Additional dependencies for testing:

pytest
pytest-cov
numpy

Changelog

Changes are documented in CHANGELOG.md

Citation

If you find this work useful, please cite our upcoming paper.

License

See LICENSE and NOTICE.

SPDX-License-Identifier: MIT

LLNL-CODE-829369
