
Tensor learning in Python.


TensorLy

TensorLy is a Python library that aims to make tensor learning simple and accessible. It lets you easily perform tensor decomposition, tensor learning and tensor algebra. Its backend system lets you seamlessly perform computation with NumPy, MXNet, PyTorch, TensorFlow or CuPy, and run methods at scale on CPU or GPU.


Installing TensorLy

The only prerequisite is to have Python 3 installed. The easiest way to get it is via the Anaconda distribution.

With pip (recommended):

pip install -U tensorly

With conda:

conda install -c tensorly tensorly

Development (from git)

# clone the repository
git clone https://github.com/tensorly/tensorly
cd tensorly
# Install in editable mode with `-e` or, equivalently, `--editable`
pip install -e .

Note: TensorLy depends on NumPy by default. If you want to use the MXNet, PyTorch, TensorFlow or CuPy backend, you will need to install that package separately.
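
For example, as a rough sketch (package names as published on PyPI; check each backend's own install instructions for details):

pip install torch       # PyTorch backend
pip install mxnet       # MXNet backend
pip install tensorflow  # TensorFlow backend

CuPy ships CUDA-version-specific wheels, so refer to the CuPy documentation for the right package name.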

For detailed instructions, please see the documentation.


Running the tests

Testing and documentation are an essential part of this package, and all functions come with unit tests and documentation.

The tests are run using the pytest package (though you can also use nose). First, install pytest:

pip install pytest

Then, to run the tests, simply run in the terminal:

pytest -v tensorly

Alternatively, you can specify which backend to run the tests with:

TENSORLY_BACKEND='numpy' pytest -v tensorly
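
The same mechanism works for any of the supported backends, assuming the corresponding package is installed, e.g.:

TENSORLY_BACKEND='pytorch' pytest -v tensorly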

Quickstart

Create a small third-order tensor of size 3 x 4 x 2 and perform simple operations on it:

import tensorly as tl
import numpy as np


tensor = tl.tensor(np.arange(24).reshape((3, 4, 2)), dtype=tl.float64)  # create a 3 x 4 x 2 tensor
unfolded = tl.unfold(tensor, mode=0)           # unfold (matricize) the tensor along mode 0
tl.fold(unfolded, mode=0, shape=tensor.shape)  # fold it back into the original tensor
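
As a quick sanity check (a minimal sketch, assuming the default NumPy backend), the mode-0 unfolding of a 3 x 4 x 2 tensor is a 3 x 8 matrix, and folding it back recovers the original tensor exactly:

import numpy as np
import tensorly as tl

tensor = tl.tensor(np.arange(24).reshape((3, 4, 2)), dtype=tl.float64)
unfolded = tl.unfold(tensor, mode=0)                      # mode-0 fibers become rows
refolded = tl.fold(unfolded, mode=0, shape=tensor.shape)

print(unfolded.shape)              # (3, 8)
print(tl.norm(refolded - tensor))  # 0.0 -- folding undoes unfolding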

Applying tensor decomposition is easy:

from tensorly.decomposition import tucker
# Apply Tucker decomposition
tucker_tensor = tucker(tensor, rank=[2, 2, 2])
# Reconstruct the full tensor from the decomposed form
tl.tucker_to_tensor(tucker_tensor)
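
As a sanity check (a minimal sketch; tl.norm with its default settings computes the Frobenius norm), you can measure how well the rank-(2, 2, 2) Tucker approximation reconstructs the original tensor:

import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

tensor = tl.tensor(np.arange(24).reshape((3, 4, 2)), dtype=tl.float64)
tucker_tensor = tucker(tensor, rank=[2, 2, 2])
reconstruction = tl.tucker_to_tensor(tucker_tensor)

# Relative reconstruction error of the rank-(2, 2, 2) approximation
rel_error = tl.norm(reconstruction - tensor) / tl.norm(tensor)
print(rel_error)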

You can change the backend to perform the computation with a different framework. Note that using MXNet, PyTorch, TensorFlow or CuPy requires having installed them first. For instance, after setting the backend to PyTorch, all the computation is done by PyTorch, and tensors can be created on GPU:

tl.set_backend('pytorch') # Or 'mxnet', 'numpy', 'tensorflow' or 'cupy'
tensor = tl.tensor(np.arange(24).reshape((3, 4, 2)), device='cuda:0')
type(tensor) # torch.Tensor
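
As a rough sketch (assuming PyTorch is installed; the device argument above is only needed if you want the tensor on GPU), the same decomposition code then runs unchanged on PyTorch tensors, and tl.to_numpy converts the result back to a NumPy array:

import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

tl.set_backend('pytorch')

tensor = tl.tensor(np.arange(24.).reshape((3, 4, 2)))  # a torch.Tensor on CPU
tucker_tensor = tucker(tensor, rank=[2, 2, 2])
reconstruction = tl.tucker_to_tensor(tucker_tensor)    # also a torch.Tensor

print(type(tl.to_numpy(reconstruction)))               # <class 'numpy.ndarray'>

tl.set_backend('numpy')  # switch back to the default backend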

For more information on getting started, check out the user guide; for a detailed reference of the functions and their documentation, refer to the API reference.

If you see a bug, open an issue, or better yet, a pull request!


Citing

If you use TensorLy in an academic paper, please cite the following:

@article{tensorly,
  author  = {Jean Kossaifi and Yannis Panagakis and Anima Anandkumar and Maja Pantic},
  title   = {TensorLy: Tensor Learning in Python},
  journal = {Journal of Machine Learning Research},
  year    = {2019},
  volume  = {20},
  number  = {26},
  pages   = {1-6},
  url     = {http://jmlr.org/papers/v20/18-277.html}
}
