

Metaflow torchrun decorator

Introduction

This repository implements a plugin that runs parallel Metaflow tasks as nodes in a torchrun job, which can be submitted to AWS Batch or a Kubernetes cluster.

Features

  • Automatic torchrun integration: This extension provides a simple and intuitive way to incorporate PyTorch distributed programs into your Metaflow workflows using the @torchrun decorator.
  • No changes to model code: The @torchrun decorator exposes a new method on the Metaflow current object, so you can run your existing torch distributed programs inside Metaflow tasks with no changes to the research code.
  • Run one command: You don't need to log into many nodes and run commands on each. Instead, the @torchrun decorator selects arguments for the torchrun command based on the resources requested in Metaflow compute decorators, such as the number of GPUs. Network addresses are discovered automatically.
  • No user-facing subprocess calls: At the end of the day, @torchrun calls a subprocess inside a Metaflow task. Although many Metaflow users do this, it can make code difficult to read for beginners. One major goal of this plugin is to harden and automate a common pattern for making subprocess calls inside Metaflow tasks.
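Conceptually, the command the decorator assembles looks like a hand-written torchrun invocation. The sketch below is purely illustrative of that idea; the function name and parameters are hypothetical and are not the plugin's internals:

```python
def build_torchrun_command(entrypoint, entrypoint_args, nnodes,
                           nproc_per_node, node_rank, master_addr):
    """Hypothetical sketch: assemble a torchrun launch line."""
    cmd = [
        "torchrun",
        f"--nnodes={nnodes}",                  # e.g. derived from num_parallel
        f"--nproc-per-node={nproc_per_node}",
        f"--node-rank={node_rank}",            # this task's index in the parallel step
        f"--master-addr={master_addr}",        # discovered automatically at runtime
        entrypoint,
    ]
    for flag, value in entrypoint_args.items():
        cmd += [f"--{flag}", str(value)]       # user args pass through unchanged
    return cmd
```

For example, node 0 of a two-node job running main.py would get a command list starting with "torchrun", "--nnodes=2", and ending with the entrypoint and its flags.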

Installation

You can install it with:

pip install metaflow-torchrun

Getting Started

Then import it and use it in parallel steps:

from metaflow import FlowSpec, current, kubernetes, step, torchrun

...
class MyGPT(FlowSpec):

    @step
    def start(self):
        self.next(self.torch_multinode, num_parallel=N_NODES)

    @kubernetes(cpu=N_CPU, gpu=N_GPU, memory=MEMORY)
    @torchrun
    @step
    def torch_multinode(self):
        ...
        current.torch.run(
            entrypoint="main.py",  # No changes made to the original script.
            entrypoint_args={"main-arg-1": "123", "main-arg-2": "777"},
            nproc_per_node=1,      # Rare case of a user-facing torchrun arg.
        )
        ...
    ...
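For reference, a minimal main.py compatible with the snippet above might look like the following. This is an illustrative stand-in, not an official example; a real script would typically call torch.distributed, which torchrun configures through environment variables such as RANK and WORLD_SIZE:

```python
# main.py -- minimal stand-in for the unmodified entrypoint script.
import argparse
import os

def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument("--main-arg-1", dest="main_arg_1")
    parser.add_argument("--main-arg-2", dest="main_arg_2")
    return parser.parse_args(argv)

def describe(args):
    # torchrun exports RANK and WORLD_SIZE to every process it spawns.
    rank = int(os.environ.get("RANK", 0))
    world = int(os.environ.get("WORLD_SIZE", 1))
    return f"rank {rank}/{world} got {args.main_arg_1}, {args.main_arg_2}"

if __name__ == "__main__":
    print(describe(parse_args()))
```

Note how the keys of entrypoint_args become command-line flags on the script, which is why the original script needs no modification.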

Examples

Each example directory contains a torch script and a flow:

  • Hello: each process prints its rank and the world size.
  • Tensor pass: the main process passes a tensor to the workers.
  • Torch DDP: a flow that uses a script from the torchrun tutorials on multi-node DDP.
  • MinGPT: a flow that runs a torchrun GPT demo that simplifies Karpathy's minGPT in a set of parallel Metaflow tasks, each contributing its @resources.

License

metaflow-torchrun is distributed under the Apache License.
