
An Efficient Unitary Neural Network implementation for PyTorch

Project description

torch_eunn

This repository contains a simple PyTorch implementation of a Tunable Efficient Unitary Neural Network (EUNN) Cell.

The implementation is loosely based on the tunable EUNN presented in this paper: https://arxiv.org/abs/1612.05231.

Installation

    pip install torch_eunn

Usage

    from torch_eunn import EUNN # feed forward layer
    from torch_eunn import EURNN # recurrent unit

Note

The hidden_size and the capacity of the EUNN need to be even, as explained in the section "Difference with original implementation".

Examples

Requirements

  • PyTorch >= 0.4.0: conda install pytorch -c pytorch

Difference with original implementation

This implementation of the EUNN differs in one major way from the original implementation proposed in https://arxiv.org/abs/1612.05231, as outlined below.

In the original implementation, the first output of the top directional coupler of a capacity-2 sublayer skips the second layer of directional couplers (indicated with dots in the ascii figure below) and connects directly to the next capacity-2 sublayer of the EUNN. The reverse happens at the bottom, where the first layer of directional couplers of the capacity-2 sublayer is skipped. This yields a (2*n+1)-dimensional unitary matrix representation, where n is the number of mixing units in each capacity-1 sublayer.
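Each directional coupler ("mixing unit") can be written as a 2x2 unitary with two real parameters. A small NumPy check (the exact parameterization below is one common choice consistent with the paper; treat it as illustrative):

```python
import numpy as np

def mixing_unit(theta, phi):
    # One directional coupler: a 2x2 unitary parameterized by two reals.
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
        [np.sin(theta),                     np.cos(theta)],
    ])

u = mixing_unit(0.3, 1.2)
assert np.allclose(u @ u.conj().T, np.eye(2))  # unitary for any theta, phi
```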

  __  __......
    \/
  __/\____  __
          \/
  __  ____/\__
    \/
  __/\____  __
          \/
  ......__/\__

Each capacity-1 sublayer with N=2*n+1 inputs (N odd) thus has N-1 parameters (each mixing unit has 2 parameters). To obtain a unitary matrix representation that spans the full unitary space, one therefore needs N capacity-1 sublayers plus N extra phases appended to the back of the capacity-N sublayer, bringing the total number of parameters to N**2, the number of independent parameters in an N-dimensional unitary matrix.
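The parameter count above can be verified with a few lines of arithmetic (a sketch of the counting argument only, not of the package's code):

```python
# Original scheme: N = 2*n + 1 inputs (N odd), n mixing units per
# capacity-1 sublayer, 2 parameters per mixing unit.
N = 7                          # example odd size
per_sublayer = N - 1           # 2 * n parameters per capacity-1 sublayer
total = N * per_sublayer + N   # N sublayers, plus N extra phases
assert total == N**2           # matches the N^2 real parameters of U(N)
```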

In the implementation proposed here, the dots in each capacity-2 sublayer are connected onto themselves (periodic boundaries). As a result, each capacity-1 sublayer with n directional couplers has N=2*n inputs and just as many independent parameters. Only N capacity-1 sublayers, and no extra phases, are therefore needed to span the full unitary space with N**2 parameters.

This, however, implies that the hidden_size = N = 2*n of the unitary matrix must always be even. Moreover, because the forward pass is defined per capacity-2 sublayer (as opposed to per capacity-1 sublayer in the original implementation), the capacity has to be even as well.
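The periodic-boundary construction can be sketched in NumPy (illustrative only: the coupler parameterization and layer ordering are assumptions, not the package's actual code; the assertions check unitarity and the N**2 parameter count, while spanning the full unitary group is the paper's claim):

```python
import numpy as np

def mixing_unit(theta, phi):
    # One directional coupler: a 2x2 unitary with two real parameters.
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
        [np.sin(theta),                     np.cos(theta)],
    ])

def sublayer(N, thetas, phis, shifted):
    # Capacity-1 sublayer: n = N//2 couplers acting on pairs (0,1),(2,3),...
    # When shifted, the couplers act on (1,2),(3,4),...,(N-1,0): the last
    # pair wraps around, realizing the periodic boundary.
    U = np.zeros((N, N), dtype=complex)
    offset = 1 if shifted else 0
    for k in range(N // 2):
        i = (offset + 2 * k) % N
        j = (offset + 2 * k + 1) % N
        u = mixing_unit(thetas[k], phis[k])
        U[i, i], U[i, j], U[j, i], U[j, j] = u[0, 0], u[0, 1], u[1, 0], u[1, 1]
    return U

rng = np.random.default_rng(0)
N = 6  # hidden_size: must be even
U = np.eye(N, dtype=complex)
n_params = 0
for layer in range(N):  # N capacity-1 sublayers, alternating shifted/unshifted
    thetas = rng.uniform(0, 2 * np.pi, N // 2)
    phis = rng.uniform(0, 2 * np.pi, N // 2)
    U = sublayer(N, thetas, phis, shifted=bool(layer % 2)) @ U
    n_params += N  # N//2 couplers x 2 parameters each

assert n_params == N**2                        # no extra phases needed
assert np.allclose(U @ U.conj().T, np.eye(N))  # the product stays unitary
```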

License

© Floris Laporte, 2018-2019.

Made available under the MIT license.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torch_eunn-0.2.1.tar.gz (5.8 kB)

Uploaded Source

Built Distribution

torch_eunn-0.2.1-py3-none-any.whl (7.0 kB)

Uploaded Python 3

File details

Details for the file torch_eunn-0.2.1.tar.gz.

File metadata

  • Download URL: torch_eunn-0.2.1.tar.gz
  • Upload date:
  • Size: 5.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.7.7

File hashes

Hashes for torch_eunn-0.2.1.tar.gz
  • SHA256: 9f7222040c0eb7f292396970e4f3fc6ed4292209f7df5f8fcba7f83293ea5cbe
  • MD5: 2f61bfacb51d7290b5c2b5910d8f9ae7
  • BLAKE2b-256: b94b2ff753c6cbb5eea7c284a73fafe35b7e5a8325d1a4c7e7d632d7a24195de

See more details on using hashes here.

File details

Details for the file torch_eunn-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: torch_eunn-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.7.7

File hashes

Hashes for torch_eunn-0.2.1-py3-none-any.whl
  • SHA256: 953d1d2d6f3a2bf2718b2cb6b8fdab9ed010d80d1a488033f3dd6658cb4f6bfa
  • MD5: ad493bac6b57b0e19cbbcb0b8b35855f
  • BLAKE2b-256: 155cbfab8177d2f22416f894e4bed67241c7ffe910f82991e15190a4fbf886fc

