
TorchStore

A storage solution for PyTorch tensors with distributed tensor support.

TorchStore provides a distributed, asynchronous tensor storage system built on top of Monarch actors. It enables efficient storage and retrieval of PyTorch tensors across multiple processes and nodes with support for various transport mechanisms including RDMA when available.

Key Features:

  • Distributed tensor storage with configurable storage strategies
  • Asynchronous put/get operations for tensors and arbitrary objects
  • Support for PyTorch state_dict serialization/deserialization
  • Multiple transport backends (RDMA, regular TCP) for optimal performance
  • Flexible storage volume management and sharding strategies
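To make the asynchronous put/get pattern concrete without a running Monarch cluster, here is a toy in-memory stand-in. `ToyStore` and its methods are illustrative inventions for this sketch, not the TorchStore API; the real store distributes data across processes and transports.

```python
import asyncio

# Toy in-memory key/value store mimicking the asynchronous put/get
# pattern described above. This is NOT the TorchStore API -- just a
# sketch of the calling convention for tensors and arbitrary objects.
class ToyStore:
    def __init__(self):
        self._data = {}

    async def put(self, key, value):
        # A real distributed store would serialize and ship `value`
        # over a transport (RDMA/TCP); here we just keep a reference.
        self._data[key] = value

    async def get(self, key):
        return self._data[key]

async def main():
    store = ToyStore()
    await store.put("weights", [1.0, 2.0, 3.0])  # any picklable object works
    fetched = await store.get("weights")
    print(fetched)  # → [1.0, 2.0, 3.0]

asyncio.run(main())
```

The actual API surface (`ts.put`, `ts.get`, `ts.initialize`) is shown in the Usage section below.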

Note: TorchStore currently supports only multi-process/multi-node jobs launched with Monarch (this may change in the future). For more information on Monarch, see https://github.com/meta-pytorch/monarch?tab=readme-ov-file#monarch-

⚠️ Early Development Warning: TorchStore is currently experimental. Expect bugs, incomplete features, and APIs that may change in future versions. The project welcomes bug fixes, but to keep efforts coordinated, please discuss any significant change before starting work. Signal your intention to contribute in the issue tracker, either by filing a new issue or by claiming an existing one.

Installation

Env Setup

conda create -n torchstore python=3.12
conda activate torchstore
pip install torch

git clone git@github.com:meta-pytorch/monarch.git
python monarch/scripts/install_nightly.py

git clone git@github.com:meta-pytorch/torchstore.git
cd torchstore
pip install -e .

Development Installation

To install the package in development mode:

# Clone the repository
git clone https://github.com/meta-pytorch/torchstore.git
cd torchstore

# Install in development mode
pip install -e .

# Install development dependencies
pip install -e '.[dev]'

# NOTE: It's common to run into libpytorch issues. A good workaround is to export:
# export LD_LIBRARY_PATH="$CONDA_PREFIX/lib:${LD_LIBRARY_PATH:-}"

Regular Installation

To install the package directly from the repository:

pip install git+https://github.com/meta-pytorch/torchstore.git

Once installed, you can import it in your Python code:

import torchstore

Usage

import asyncio

import torch

from monarch.actor import Actor, current_rank, endpoint

import torchstore as ts
from torchstore.utils import spawn_actors


WORLD_SIZE = 4


# In monarch, Actors are the way we represent multi-process/node applications. For additional details, see:
# https://github.com/meta-pytorch/monarch?tab=readme-ov-file#monarch-
class ExampleActor(Actor):
    def __init__(self, world_size=WORLD_SIZE):
        self.rank = current_rank().rank
        self.world_size = world_size

    @endpoint
    async def store_tensor(self):
        t = torch.tensor([self.rank])
        await ts.put(f"{self.rank}_tensor", t)

    @endpoint
    async def print_tensor(self):
        other_rank = (self.rank + 1) % self.world_size
        t = await ts.get(f"{other_rank}_tensor")
        print(f"Rank=[{self.rank}] Fetched {t} from {other_rank=}")


async def main():

    # Create a store instance
    await ts.initialize()

    actors = await spawn_actors(WORLD_SIZE, ExampleActor, "example_actors")

    # Calls "store_tensor" on each actor instance
    await actors.store_tensor.call()
    await actors.print_tensor.call()

if __name__ == "__main__":
    asyncio.run(main())

# Expected output
# [0] [2] Rank=[2] Fetched tensor([3]) from other_rank=3
# [0] [0] Rank=[0] Fetched tensor([1]) from other_rank=1
# [0] [3] Rank=[3] Fetched tensor([0]) from other_rank=0
# [0] [1] Rank=[1] Fetched tensor([2]) from other_rank=2

Resharding Support with DTensor

TorchStore makes it easy to fetch arbitrary slices of any DTensor (Distributed Tensor). For a full DTensor example, see examples/dtensor.py

class DTensorActor(Actor):
    """
    Example pseudo-code for an Actor utilizing DTensor support

    Full actor definition in [examples/dtensor.py](https://github.com/meta-pytorch/torchstore/blob/main/example/dtensor.py)
    """

    @endpoint
    async def do_put(self):
        # Typical dtensor boiler-plate
        self.initialize_distributed()
        device_mesh = init_device_mesh("cpu", self.mesh_shape)
        tensor = self.original_tensor.to("cpu")
        dtensor = distribute_tensor(tensor, device_mesh, placements=self.placements)

        print(f"Calling put with {dtensor=}")
        # This will place only the local shard into TorchStore
        await ts.put(self.shared_key, dtensor)

    @endpoint
    async def do_get(self):
        # Typical dtensor boiler-plate
        self.initialize_distributed()
        device_mesh = init_device_mesh("cpu", self.mesh_shape)
        tensor = self.original_tensor.to("cpu")
        dtensor = distribute_tensor(tensor, device_mesh, placements=self.placements)

        # Torchstore will use the metadata in the local dtensor to only fetch tensor data
        # which belongs to the local shard.
        fetched_tensor = await ts.get(self.shared_key, dtensor)
        print(fetched_tensor)

# Check out tests/test_resharding.py for more end-to-end examples of resharding DTensor.
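To see the bookkeeping behind fetching only the local shard, consider a 1-D tensor split across ranks with a `Shard(0)`-style placement. The sketch below computes which source shards overlap a destination shard, assuming even chunks with the remainder given to the lowest ranks (a common convention; DTensor's exact chunking rules may differ). This is standalone illustration of the resharding arithmetic, not TorchStore internals.

```python
def shard_range(dim_size, world_size, rank):
    """Global [start, end) owned by `rank` when a dimension of length
    `dim_size` is split evenly, remainder going to the lowest ranks."""
    base, rem = divmod(dim_size, world_size)
    start = rank * base + min(rank, rem)
    end = start + base + (1 if rank < rem else 0)
    return start, end

def reshard(src_world, dst_world, dim_size, dst_rank):
    """Which (src_rank, local_slice) pairs a destination rank must
    fetch to rebuild its shard when the sharding layout changes."""
    d0, d1 = shard_range(dim_size, dst_world, dst_rank)
    plan = []
    for src_rank in range(src_world):
        s0, s1 = shard_range(dim_size, src_world, src_rank)
        lo, hi = max(d0, s0), min(d1, s1)
        if lo < hi:
            # Slice is expressed in the source shard's local coordinates.
            plan.append((src_rank, slice(lo - s0, hi - s0)))
    return plan

# A length-10 tensor stored from 4 ranks, fetched by rank 0 of 2 ranks:
print(reshard(src_world=4, dst_world=2, dim_size=10, dst_rank=0))
# → [(0, slice(0, 3, None)), (1, slice(0, 2, None))]
```

In TorchStore this translation is driven by the metadata of the DTensor passed to `ts.get`, so callers never compute overlaps by hand.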

Testing

Pytest is used for testing. For an example of how to run tests (and capture logs), see:

`TORCHSTORE_LOG_LEVEL=DEBUG pytest -vs --log-cli-level=DEBUG tests/test_models.py::test_basic`

License

TorchStore is BSD-3 licensed, as found in the LICENSE file.

Download files

Source Distribution

  • torchstore-0.0.1rc3.tar.gz (40.2 kB)

Built Distribution

  • torchstore-0.0.1rc3-py3-none-any.whl (31.5 kB)

File details

Details for the file torchstore-0.0.1rc3.tar.gz.

File metadata

  • Download URL: torchstore-0.0.1rc3.tar.gz
  • Size: 40.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.19

File hashes

  • SHA256: 92bd66cbe42d0da391f17960a4dba6ecce6d753b3786004a86f795457b8735eb
  • MD5: c5a64e9c31220c2d684efa02e242b3a0
  • BLAKE2b-256: 731eb737e8c2e7155e349fbd918371054baec2f852e42cba13503b43ec5dcb69

File details

Details for the file torchstore-0.0.1rc3-py3-none-any.whl.

File metadata

  • Download URL: torchstore-0.0.1rc3-py3-none-any.whl
  • Size: 31.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.19

File hashes

  • SHA256: 1da2a33e0dbcbbc0a93b25a9cf63af52ee09b3fd64dbc263fd28504ab8bf12a3
  • MD5: b5efd2157fc6767c30a4adbfd2390cbf
  • BLAKE2b-256: f01b5e9dee266fbb0fee7afed17f8498d5e93bb8cb1028a4ac830079e9632035
