
An Einstein notation function that allows for simple, readable, and highly flexible tensor operations.

Project description

Einfunc



A convenient way of applying functions to tensors. einfunc is very similar to einsum, but with one key difference: the ability to apply your own custom function instead of multiplication. einfunc also lets you choose how reductions occur within the operation.

einfunc is a simple interface built on PyTorch's torchdim. I highly recommend checking out torchdim, as it is by far the best way to do readable tensor operations in PyTorch. einfunc is just a convenient way of tapping into torchdim with a function similar to einsum.


Installation

einfunc requires torch >= 2.0 and python >= 3.11.

pip install einfunc

API

Using einfunc is similar to einsum; however, you also pass a function and a mode of reduction. Take this equation, for example:

$$ \frac{1}{K} \sum_j^K \frac{\log (z_j)}{\prod_i^B \left(x_i^2 - e^{y_{i,j}}\right)} $$

We can represent this equation with einfunc in two lines of code:

inner_exp = einfunc(x, y, 'b, b k -> k', lambda a, b : a ** 2 - torch.exp(b), reduce='prod')
final_exp = einfunc(z, inner_exp, 'k, k -> ', lambda a, b : torch.log(a) / b, reduce='mean')

Creating functions

While lambda functions are simple, any function will work as long as it takes the correct number of inputs. For example, in the following expression:

inner_exp = einfunc(x, y, 'i, i j -> j', lambda a, b : a ** 2 - torch.exp(b), reduce='prod')

x maps to a and y maps to b. In other words, function arguments are bound in the order that the tensors are passed to einfunc.
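To make the positional binding concrete, here is a hedged sketch (again plain Python, not einfunc itself) of what the expression above does: a named function with the right arity replaces the lambda, and the first tensor feeds its first parameter.

```python
import math

# Arguments are bound positionally: the first tensor (x) becomes `a`,
# the second (y) becomes `b`. Any callable with the right arity works.
def my_func(a, b):          # a <- x[i], b <- y[i][j]
    return a ** 2 - math.exp(b)

x = [1.0, 2.0]                    # indexed by i
y = [[0.0, 1.0], [2.0, 3.0]]      # indexed by i, j

# Loop equivalent of einfunc(x, y, 'i, i j -> j', my_func, reduce='prod')
result = []
for j in range(2):
    prod = 1.0
    for i in range(2):
        prod *= my_func(x[i], y[i][j])
    result.append(prod)
print(result)
```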

Reducing

Currently, einfunc supports five types of reduction.

  • Mean

$$ \frac{1}{I} \sum_i^I x_{i,j} - y_{k, i} $$

result = einfunc(x, y, 'i j, k i -> j k', lambda a, b : a - b, reduce='mean')
  • Sum

$$ \sum_i^I x_{i,j} - y_{k, i} $$

result = einfunc(x, y, 'i j, k i -> j k', lambda a, b : a - b, reduce='sum')
  • Prod

$$ \prod_i^I x_{i,j} - y_{k, i} $$

result = einfunc(x, y, 'i j, k i -> j k', lambda a, b : a - b, reduce='prod')
  • Max

$$ \max_i \left(x_{i,j} - y_{k,i}\right) $$

result = einfunc(x, y, 'i j, k i -> j k', lambda a, b : a - b, reduce='max')
  • Min

$$ \min_i \left(x_{i,j} - y_{k,i}\right) $$

result = einfunc(x, y, 'i j, k i -> j k', lambda a, b : a - b, reduce='min')

Note that if reduce is not passed, einfunc defaults to 'sum'.
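The five modes above can be summarized in a small dispatcher. This is a sketch of the semantics, not einfunc's internals; `apply_reduce` is a hypothetical helper that collapses a list of elementwise results the way each `reduce=` option would:

```python
from functools import reduce as _reduce

# Sketch of the five reduction modes over a list of elementwise results,
# mirroring einfunc's reduce= options ('sum' is the default).
def apply_reduce(values, mode='sum'):
    if mode == 'sum':
        return sum(values)
    if mode == 'mean':
        return sum(values) / len(values)
    if mode == 'prod':
        return _reduce(lambda acc, v: acc * v, values, 1.0)
    if mode == 'max':
        return max(values)
    if mode == 'min':
        return min(values)
    raise ValueError(f"unknown reduce mode: {mode!r}")

vals = [1.0, 2.0, 3.0]
print([apply_reduce(vals, m) for m in ('sum', 'mean', 'prod', 'max', 'min')])
# -> [6.0, 2.0, 6.0, 3.0, 1.0]
```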

Why you shouldn't use einfunc

einfunc is just a convenient way of interfacing with PyTorch and torchdim, which adds some overhead compared to vanilla operations and direct torchdim operations. For simple operations, vanilla PyTorch will be much faster; for something more complex, you may want to use torchdim directly.

Why you should use einfunc

It's convenient and, in my opinion, slightly more readable than torchdim. Understanding exactly what is happening in a tensor operation can be hard; einfunc makes it simpler by boiling the operation down to a single expression, using Einstein notation to indicate indexing.

Additional Examples

Coming Soon :)

Planned Work

Currently, einfunc does not support parentheses or ellipses. I will be working on implementing these as soon as I can.

Acknowledgements

Check out einops and torchdim.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

einfunc-0.0.4.tar.gz (6.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

einfunc-0.0.4-py3-none-any.whl (5.5 kB)

Uploaded Python 3

File details

Details for the file einfunc-0.0.4.tar.gz.

File metadata

  • Download URL: einfunc-0.0.4.tar.gz
  • Upload date:
  • Size: 6.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.27.0

File hashes

Hashes for einfunc-0.0.4.tar.gz
Algorithm Hash digest
SHA256 19177c56f8ef7caf84a95ec2984191eb6cbe4ad02e380414f764c6048eb13af8
MD5 956056f30332f9e5e1d959c5da21dbed
BLAKE2b-256 60c35a66516db12626eec1b90e8c5642cc85ad2903837f38632c9ac2645e37ee

See more details on using hashes here.

File details

Details for the file einfunc-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: einfunc-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 5.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.27.0

File hashes

Hashes for einfunc-0.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 d0d0ae11e4f886b70ebbe7170ad80242b562ab7de1b2ced676bd9ebbc168213f
MD5 17f30c22760e7682342d2a18bd2cd373
BLAKE2b-256 49b2114bc816813f5aa909393c3b1b018f6f461b9b795aab9680bccf8124893c

See more details on using hashes here.
