Memory-efficient, optimized einsum using opt_einsum planning and PyTorch kernels.

Project description

opt-einsum-torch

There have been many implementations of Einstein summation. NumPy's numpy.einsum is the least efficient one, as it runs single-threaded on the CPU. PyTorch's torch.einsum works for both CPU and CUDA tensors. However, since CUDA memory is not virtualized, torch.einsum will run out of CUDA memory for sufficiently large tensors.
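
For a sense of where the memory pressure comes from, a plain torch.einsum call moves every operand onto the GPU in full before contracting. The sketch below is purely illustrative (shapes are made up, and a CUDA device is assumed):

import torch

# Plain torch.einsum: both full operands plus intermediates must fit in
# CUDA memory at once, so large enough i/j/k/l sizes trigger an OOM error.
a = torch.randn(64, 512, 512)   # shape (i, j, k)
b = torch.randn(512, 512, 64)   # shape (j, k, l)
out = torch.einsum('ijk,jkl->il', a.cuda(), b.cuda()).cpu()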

This package aims to implement a memory-efficient einsum function using PyTorch as the backend. It also uses the opt_einsum package to optimize the contraction path and minimize the total FLOPs.
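
The path planning itself can be inspected directly with opt_einsum. The sketch below, with small made-up array shapes, shows the kind of contraction-order and FLOP information that gets optimized:

import numpy as np
import opt_einsum as oe

a = np.random.rand(8, 16, 16)   # (i, j, k) -- small illustrative shapes
b = np.random.rand(16, 16, 8)   # (j, k, l)

# contract_path returns the chosen pairwise contraction order and a cost
# summary (estimated FLOP count, largest intermediate, etc.).
path, info = oe.contract_path('ijk,jkl->il', a, b)
print(path)
print(info)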

Usage

from opt_einsum_torch import EinsumPlanner
import torch

# Some huge tensors
arr1, arr2 = ..., ...
ee = EinsumPlanner(torch.device('cuda:0'), cuda_mem_limit=0.9)
result = ee.einsum('ijk,jkl->il', arr1, arr2)

The resulting tensor result will be a PyTorch CPU tensor. You can convert it into a NumPy array by simply calling result.numpy().
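
As a minimal follow-on sketch (assuming the snippet above has already produced result):

np_result = result.numpy()                # NumPy view of the CPU tensor; no data copy
print(type(np_result), np_result.shape)   # numpy.ndarray with shape (i, l)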

Future work

  • Support multiple GPUs.
  • Memory-efficient einsum kernels.
  • CUDA data transfer profilers.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

opt-einsum-torch-0.1.0.tar.gz (10.0 kB)

Uploaded Source

Built Distribution

opt_einsum_torch-0.1.0-py3-none-any.whl (10.2 kB)

Uploaded Python 3

File details

Details for the file opt-einsum-torch-0.1.0.tar.gz.

File metadata

  • Download URL: opt-einsum-torch-0.1.0.tar.gz
  • Upload date:
  • Size: 10.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.12

File hashes

Hashes for opt-einsum-torch-0.1.0.tar.gz
  • SHA256: feeae21ea0ff6427c3095c3361e9444b0df70ebaaf0c985e47cf728fe4129c45
  • MD5: df61d1dddb9b3960c571e9131b73a910
  • BLAKE2b-256: dbaef092ce0cb43ad33099a1a5ecce1acb3b7a8d480ccf7af9c618da691696a6

See more details on using hashes here.
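
If you want to check a download by hand, the SHA256 digest above can be reproduced locally. A small sketch, assuming the sdist sits in the current directory:

import hashlib

# Recompute the SHA256 of the downloaded sdist and compare it with the
# digest listed above before installing.
with open('opt-einsum-torch-0.1.0.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = 'feeae21ea0ff6427c3095c3361e9444b0df70ebaaf0c985e47cf728fe4129c45'
print(digest == expected)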

File details

Details for the file opt_einsum_torch-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: opt_einsum_torch-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.12

File hashes

Hashes for opt_einsum_torch-0.1.0-py3-none-any.whl
  • SHA256: 1f784f9f9b4df402ea08a0e2037787b9c44809bd5bd3b9f366d3ae78f553990f
  • MD5: 14dd91e2232b246e1df9764e67c631fb
  • BLAKE2b-256: 5cd7bf9a3d615cf0ed11b24eec4ce6f31bc4f1717c07818174c3eaaf951b272c

See more details on using hashes here.
