An Efficient Unitary Neural Network implementation for PyTorch

Project description

torch_eunn

This repository contains a simple PyTorch implementation of a Tunable Efficient Unitary Neural Network (EUNN) Cell.

The implementation is loosely based on the tunable EUNN presented in this paper: https://arxiv.org/abs/1612.05231.

Installation

    pip install torch_eunn

Usage

    from torch_eunn import EUNN   # feed-forward layer
    from torch_eunn import EURNN  # recurrent unit

Note

The hidden_size and the capacity of the EUNN need to be even, as explained in the section "Difference with original implementation".

Examples

Requirements

  • PyTorch >= 0.4.0: conda install pytorch -c pytorch

Difference with original implementation

This implementation of the EUNN differs in one major way from the original implementation proposed in https://arxiv.org/abs/1612.05231, as outlined below.

In the original implementation, the first output of the top directional coupler of a capacity-2 sublayer skips the second layer of directional couplers (indicated with dots in the ascii figure below) and connects directly to the next capacity-2 sublayer of the EUNN. The reverse happens at the bottom, where the first layer of directional couplers is skipped. This yields a (2*n+1)-dimensional unitary matrix representation, with n the number of mixing units in each capacity-1 sublayer.

  __  __......
    \/
  __/\____  __
          \/
  __  ____/\__
    \/
  __/\____  __
          \/
  ......__/\__
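The structure above can be checked numerically. The sketch below (an assumption for illustration, not the package's code) uses a common two-parameter coupler parameterization and builds one capacity-2 sublayer for n = 2 mixing units, i.e. N = 5 wires: the top sublayer mixes pairs (0,1) and (2,3) while wire 4 idles, and the bottom sublayer mixes (1,2) and (3,4) while wire 0 skips ahead.

```python
import numpy as np

def mixing_unit(theta, phi):
    """2x2 unitary 'directional coupler' with two real parameters (assumed form)."""
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
        [np.sin(theta),                     np.cos(theta)],
    ])

def sublayer(N, offset, params):
    """Capacity-1 sublayer: couplers on pairs (offset, offset+1), (offset+2, offset+3), ..."""
    U = np.eye(N, dtype=complex)
    for k, (theta, phi) in enumerate(params):
        i = offset + 2 * k
        U[i:i+2, i:i+2] = mixing_unit(theta, phi)
    return U

n = 2                      # mixing units per capacity-1 sublayer
N = 2 * n + 1              # odd dimension, as in the original paper
rng = np.random.default_rng(0)
top = sublayer(N, 0, rng.uniform(0, 2 * np.pi, (n, 2)))     # wires 0..3, wire 4 idle
bottom = sublayer(N, 1, rng.uniform(0, 2 * np.pi, (n, 2)))  # wires 1..4, wire 0 skips
U = bottom @ top           # one capacity-2 sublayer
print(np.allclose(U @ U.conj().T, np.eye(N)))  # True: the sublayer is unitary
```

Any 2x2 unitary with two free angles works here; the exact coupler form used by torch_eunn may differ.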

Each capacity-1 sublayer with N=2*n+1 inputs (N odd) therefore has N-1 parameters (two per mixing unit). To span the full unitary space, one needs N capacity-1 sublayers plus N extra phases appended to the back of the capacity-N sublayer, bringing the total number of parameters in the unitary-matrix representation to N**2, the number of independent parameters in an N-dimensional unitary matrix.
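The parameter count works out exactly, as this quick arithmetic check shows (for any odd N):

```python
# N capacity-1 sublayers with N-1 parameters each, plus N extra phases,
# together give the N**2 real parameters of an N-dimensional unitary matrix.
N = 9                                # any odd hidden size 2*n + 1
n = (N - 1) // 2                     # mixing units per capacity-1 sublayer
params_per_sublayer = 2 * n          # = N - 1 (two parameters per mixing unit)
total = N * params_per_sublayer + N  # N sublayers + N final phases
print(total == N**2)                 # True
```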

In the implementation proposed here, the dots in each capacity-2 sublayer are connected onto themselves (periodic boundaries). As a result, each capacity-1 sublayer with n directional couplers has N=2*n inputs and as many independent parameters, so just N capacity-1 sublayers, with no extra phases, span the full unitary space with N**2 parameters.

This, however, implies that the hidden_size = N = 2*n of the unitary matrix must always be even. Moreover, because the forward pass is defined per capacity-2 sublayer (as opposed to per capacity-1 sublayer in the original implementation), the capacity has to be even as well.
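The periodic-boundary variant can also be verified numerically. The sketch below (again an assumed coupler parameterization, not the package's code) wraps the odd-offset sublayer around from wire N-1 to wire 0, stacks N alternating sublayers for an even N, and confirms that the product is unitary with exactly N**2 parameters:

```python
import numpy as np

def mixing_unit(theta, phi):
    """2x2 unitary 'directional coupler' with two real parameters (assumed form)."""
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
        [np.sin(theta),                     np.cos(theta)],
    ])

def periodic_sublayer(N, offset, params):
    """Capacity-1 sublayer with periodic boundaries: pair (N-1, 0) wraps when offset == 1."""
    U = np.zeros((N, N), dtype=complex)
    for k, (theta, phi) in enumerate(params):
        i, j = (offset + 2 * k) % N, (offset + 2 * k + 1) % N
        m = mixing_unit(theta, phi)
        U[i, i], U[i, j], U[j, i], U[j, j] = m[0, 0], m[0, 1], m[1, 0], m[1, 1]
    return U

N = 6                                   # hidden_size, must be even
n = N // 2                              # mixing units per capacity-1 sublayer
rng = np.random.default_rng(0)
U = np.eye(N, dtype=complex)
for layer in range(N):                  # N sublayers, alternating offsets 0 and 1
    U = periodic_sublayer(N, layer % 2, rng.uniform(0, 2 * np.pi, (n, 2))) @ U
n_params = N * n * 2                    # two parameters per coupler = N**2 in total
print(np.allclose(U @ U.conj().T, np.eye(N)), n_params == N**2)  # True True
```

Because every wire is covered by exactly one coupler in each sublayer (the wrap-around pair closes the odd-offset layer), no wire ever skips a layer, which is what removes the need for the N extra phases.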

License

© Floris Laporte, 2018-2019.

Made available under the MIT license.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_eunn-0.2.0.tar.gz (5.5 kB)

Uploaded Source

Built Distribution

torch_eunn-0.2.0-py3-none-any.whl (6.8 kB)

Uploaded Python 3

File details

Details for the file torch_eunn-0.2.0.tar.gz.

File metadata

  • Download URL: torch_eunn-0.2.0.tar.gz
  • Upload date:
  • Size: 5.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.21.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.0

File hashes

Hashes for torch_eunn-0.2.0.tar.gz
Algorithm Hash digest
SHA256 a7a83ee7dbdc0606082d11fd511437c4f7e5b215766597395cdca8a897c8d2e1
MD5 70c86816ab3a3be270d78dd6cf67bae4
BLAKE2b-256 142bf28ab3eea55e0fcf4b6a877ea835f1a893ee7e55b1247c9112df4e97b378

See more details on using hashes here.

File details

Details for the file torch_eunn-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: torch_eunn-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 6.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.21.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.0

File hashes

Hashes for torch_eunn-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 fd7cbd019ddf0fd2db620b6d46f34cd249cad58e41a5174e49e8ec428c6f6f09
MD5 c06e87d07efe2be2b51d89b16fe894eb
BLAKE2b-256 893da3a64957a7e3344b33cb2f1c84cebdf875171ed9e63ec18246eb6a1bc7e9

See more details on using hashes here.
