
TensorDict is a dedicated tensor container for PyTorch.

Project description


TensorDict

TensorDict is a dictionary-like class that inherits properties from tensors, such as indexing, shape operations, casting to device or storage, and many more. The code base consists of two main components: TensorDict, a specialized dictionary for PyTorch tensors, and tensorclass, a dataclass for tensors.

import torch
from tensordict import TensorDict

data = TensorDict(
    obs=torch.randn(128, 84),
    action=torch.randn(128, 4),
    reward=torch.randn(128, 1),
    batch_size=[128],
)

data_gpu = data.to("cuda")      # all tensors move together
sub = data_gpu[:64]              # all tensors are sliced
stacked = torch.stack([data, data])  # works like a tensor

Key Features | Examples | Installation | Ecosystem | Citation | License

Key Features

TensorDict makes your code-bases more readable, compact, modular and fast. It abstracts away the boilerplate of per-tensor operations, dispatching each call to the leaves for you.

  • Composability: TensorDict generalizes torch.Tensor operations to collections of tensors. [tutorial]
  • Speed: asynchronous transfer to device, fast node-to-node communication through consolidate, compatible with torch.compile. [tutorial]
  • Shape operations: indexing, slicing, concatenation, reshaping -- everything you can do with a tensor. [tutorial]
  • Distributed / multiprocessed: distribute TensorDict instances across workers, devices and machines. [doc]
  • Serialization and memory-mapping for efficient checkpointing. [doc]
  • Functional programming and compatibility with torch.vmap. [tutorial]
  • Nesting: nest TensorDict instances to create hierarchical structures. [tutorial]
  • Lazy preallocation: preallocate memory without initializing tensors. [tutorial]
  • @tensorclass: a specialized dataclass for torch.Tensor. [tutorial]

Examples

Check our Getting Started guide for a full overview of TensorDict's features.

Before / after

Working with groups of tensors is common in ML. Without a shared structure, every operation must be repeated once per tensor:

# Without TensorDict
obs = obs.to("cuda")
action = action.to("cuda")
reward = reward.to("cuda")
next_obs = next_obs.to("cuda")

obs_batch = obs[:32]
action_batch = action[:32]
reward_batch = reward[:32]
next_obs_batch = next_obs[:32]

With TensorDict, all of that collapses to:

# With TensorDict
data = data.to("cuda")
data_batch = data[:32]

This holds for any operation: reshape, unsqueeze, permute, to, indexing, torch.stack, torch.cat, and many more.

Generic training loops

Using TensorDict primitives, most supervised training loops can be rewritten in a generic way:

for i, data in enumerate(dataset):
    data = model(data)
    loss = loss_module(data)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

Each step of the training loop -- data loading, model prediction, loss computation -- can be swapped independently without touching the rest. The same loop works across classification, segmentation, RL, and more.

Fast copy on device

By default, device transfers are asynchronous and synchronized only when needed:

td_cuda = TensorDict(**dict_of_tensors, device="cuda")
td_cpu = td_cuda.to("cpu")
td_cpu = td_cuda.to("cpu", non_blocking=False)  # force synchronous

Coding an optimizer

Using TensorDict you can code the Adam optimizer as you would for a single tensor and apply it to a collection of parameters. On CUDA, these operations use fused kernels:

class Adam:
    def __init__(self, weights: TensorDict, alpha: float = 1e-3,
                 beta1: float = 0.9, beta2: float = 0.999,
                 eps: float = 1e-6):
        # Lock the tensordict so keys cannot be added or removed by accident
        weights = weights.lock_()
        self.weights = weights
        self.t = 0

        # First- and second-moment estimates, initialized to zero
        self._mu = weights.data.mul(0.0)
        self._sigma = weights.data.mul(0.0)
        self.beta1 = beta1
        self.beta2 = beta2
        self.alpha = alpha
        self.eps = eps

    def step(self):
        grad = self.weights.grad
        self._mu.mul_(self.beta1).add_(grad, alpha=1 - self.beta1)
        self._sigma.mul_(self.beta2).add_(grad.pow(2), alpha=1 - self.beta2)
        self.t += 1
        # Bias correction is done out of place so the running estimates stay intact
        mu = self._mu / (1 - self.beta1 ** self.t)
        sigma = self._sigma / (1 - self.beta2 ** self.t)
        self.weights.data.add_(mu / (sigma.sqrt() + self.eps), alpha=-self.alpha)

Ecosystem

TensorDict is used across a range of domains:

| Domain                  | Projects                                                                      |
| ----------------------- | ----------------------------------------------------------------------------- |
| Reinforcement Learning  | TorchRL (PyTorch), DreamerV3-torch, Dreamer4, SkyRL                           |
| LLM Post-Training       | verl, ROLL (Alibaba), LMFlow, LoongFlow (Baidu)                               |
| Robotics & Simulation   | MuJoCo Playground (Google DeepMind), ProtoMotions (NVIDIA), holosoma (Amazon) |
| Physics & Scientific ML | PhysicsNeMo (NVIDIA)                                                          |
| Genomics                | Medaka (Oxford Nanopore)                                                      |

Installation

With pip:

pip install tensordict

For the latest features:

pip install tensordict-nightly

With conda:

conda install -c conda-forge tensordict

With uv + PyTorch nightlies:

If you're using a PyTorch nightly, install tensordict with --no-deps to prevent uv from re-resolving torch from PyPI:

uv pip install -e . --no-deps

Or explicitly point uv at the PyTorch nightly wheel index:

uv pip install -e . --prerelease=allow -f "https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html"

Citation

If you use TensorDict, please cite this work with the following BibTeX entry:

@misc{bou2023torchrl,
      title={TorchRL: A data-driven decision-making library for PyTorch},
      author={Albert Bou and Matteo Bettini and Sebastian Dittert and Vikash Kumar and Shagun Sodhani and Xiaomeng Yang and Gianni De Fabritiis and Vincent Moens},
      year={2023},
      eprint={2306.00577},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

License

TensorDict is licensed under the MIT License. See LICENSE for details.
