Metaflow torchrun decorator

Introduction

This repository implements a Metaflow plugin that runs parallel Metaflow tasks as nodes in a torchrun job, which can be submitted to AWS Batch or a Kubernetes cluster.

Features

  • Automatic torchrun integration: The @torchrun decorator provides a simple, intuitive way to run PyTorch distributed programs from your Metaflow workflows.
  • No changes to model code: @torchrun exposes a new method on the Metaflow current object, so you can run your existing torch distributed programs inside Metaflow tasks with no changes to the research code.
  • Run one command: You don't need to log into each node and launch a command by hand. Instead, @torchrun selects arguments for the torchrun command based on the requests in Metaflow compute decorators, such as the number of GPUs, and node network addresses are discovered automatically.
  • No user-facing subprocess calls: Under the hood, @torchrun launches a subprocess inside each Metaflow task. Many Metaflow users write such calls by hand, but they can make code hard to read for beginners; a major goal of this plugin is to harden and automate this pattern of submitting subprocess calls inside Metaflow tasks.
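To make that pattern concrete, here is a rough sketch of how a torchrun command line could be assembled from per-node settings before being handed to a subprocess call. The function name and signature are illustrative only, not the plugin's actual internals:

```python
# Illustrative sketch only: assembling a torchrun invocation from per-node
# settings before passing it to subprocess.run(). Hypothetical helper, not
# the plugin's real internals.
def build_torchrun_cmd(entrypoint, entrypoint_args, nnodes, node_rank,
                       master_addr, master_port=29500, nproc_per_node=1):
    cmd = [
        "torchrun",
        f"--nnodes={nnodes}",
        f"--node_rank={node_rank}",           # this task's index among the parallel tasks
        f"--master_addr={master_addr}",       # discovered automatically by the plugin
        f"--master_port={master_port}",
        f"--nproc_per_node={nproc_per_node}",
        entrypoint,
    ]
    # Forward user-supplied arguments to the entrypoint script unchanged.
    for key, value in entrypoint_args.items():
        cmd += [f"--{key}", str(value)]
    return cmd

print(build_torchrun_cmd("main.py", {"main-arg-1": "123"},
                         nnodes=2, node_rank=0, master_addr="10.0.0.1"))
```

Each parallel Metaflow task would run one such command with its own node_rank, which is how one @torchrun step fans out into a multi-node torchrun job.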

Installation

You can install it with:

pip install metaflow-torchrun

Getting Started

Then import it and use it in parallel steps:

from metaflow import FlowSpec, step, torchrun, kubernetes, current

...
class MyGPT(FlowSpec):

    @step
    def start(self):
        self.next(self.torch_multinode, num_parallel=N_NODES)

    @kubernetes(cpu=N_CPU, gpu=N_GPU, memory=MEMORY)
    @torchrun
    @step
    def torch_multinode(self):
        ...
        current.torch.run(
            entrypoint="main.py",  # no changes made to the original script
            entrypoint_args={"main-arg-1": "123", "main-arg-2": "777"},
            nproc_per_node=1,      # the one torchrun argument that stays user-facing
        )
        ...
    ...
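For reference, a minimal entrypoint compatible with the call above might look like the following. This main.py is a hypothetical sketch: it only parses the example arguments and reads the RANK and WORLD_SIZE environment variables that torchrun sets for every worker; a real script would go on to call torch.distributed.init_process_group:

```python
# Hypothetical main.py -- a minimal torchrun entrypoint sketch.
import argparse
import os

def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    # These flags correspond to the entrypoint_args dict in the flow above.
    parser.add_argument("--main-arg-1", default=None)
    parser.add_argument("--main-arg-2", default=None)
    return parser.parse_args(argv)

def main():
    args = parse_args()
    # torchrun exports these variables to every worker process it launches.
    rank = int(os.environ.get("RANK", 0))
    world_size = int(os.environ.get("WORLD_SIZE", 1))
    print(f"rank {rank}/{world_size}: "
          f"main-arg-1={args.main_arg_1}, main-arg-2={args.main_arg_2}")

if __name__ == "__main__":
    main()
```

Note that argparse converts the dashed flag names to underscored attributes (args.main_arg_1), so the script can accept the entrypoint_args keys exactly as written in the flow.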

Examples

Each example directory contains a flow and the torch script it runs:

  • Hello: Each process prints its rank and the world size.
  • Tensor pass: The main process passes a tensor to the workers.
  • Torch DDP: A flow that uses a script from the torchrun tutorials on multi-node DDP.
  • MinGPT: A flow that runs a torchrun GPT demo that simplifies Karpathy's minGPT across a set of parallel Metaflow tasks, each contributing its @resources.

License

metaflow-torchrun is distributed under the Apache License.

Download files

Download the file for your platform.

Source Distribution

metaflow_torchrun-0.2.1.tar.gz (12.7 kB)

Built Distribution

metaflow_torchrun-0.2.1-py3-none-any.whl (14.9 kB)

File details

Details for the file metaflow_torchrun-0.2.1.tar.gz.

File metadata

  • Download URL: metaflow_torchrun-0.2.1.tar.gz
  • Upload date:
  • Size: 12.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for metaflow_torchrun-0.2.1.tar.gz
Algorithm Hash digest
SHA256 10e924f82dd6bcfb87ef7d1e3dfd9ccde343a425fc4e91fbbbfe6378f35fb67d
MD5 ecb3255137ba84cb38357d0001be1694
BLAKE2b-256 7849204b328a818865511da6b11010bc55aa8f00ef5d2b2131a362e37bf06161


File details

Details for the file metaflow_torchrun-0.2.1-py3-none-any.whl.

File hashes

Hashes for metaflow_torchrun-0.2.1-py3-none-any.whl
Algorithm Hash digest
SHA256 47e73eb9b863a469213b464f6bde4e9dcba29e42b0c00c3b04833c925245fea0
MD5 80e3f1049cdd27ed6167d3a901b91a53
BLAKE2b-256 b9e392d0213247afdee229027dd8f771ed28c58f27160b707fa256e42cb8410d

