Runhouse: A multiplayer cloud compute and data environment


🏃‍♀️ Runhouse 🏠

🚨 Caution: This is an Alpha 🚨

Runhouse is heavily under development. We are sharing it with a few select people to collect feedback, and expect to iterate on the APIs considerably before reaching beta (version 0.1.0).

👵 Welcome Home!

PyTorch lets you send a Python function or tensor .to(device), so why can't you do my_fn.to('a_gcp_a100') or my_table.to('parquet_in_s3')? Runhouse allows just that: send code and data to any of your compute or data infra (with your own cloud creds), all in Python, and continue to use them eagerly exactly as they were.

Runhouse is for ML Researchers, Engineers, and Data Scientists who are tired of:

  • 🚜 manually shuttling code and data around between their local machine, remote instances, and cloud storage,
  • 📤📥 constantly spinning up and down boxes,
  • 🐜 debugging over ssh and notebook tunnels,
  • 🧑‍🔧 translating their code into a pipeline DSL just to use multiple hardware types,
  • 🪦 debugging in an orchestrator,
  • 👩‍✈️ missing out on fancy LLM IDE features,
  • 🕵️ and struggling to find their teammates' code and data artifacts.


Take a look at this code (adapted from our first tutorial):

import runhouse as rh
from diffusers import StableDiffusionPipeline
import torch

def sd_generate(prompt, num_images=1, steps=100, guidance_scale=7.5, model_id='stabilityai/stable-diffusion-2-base'):
    pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16, revision='fp16').to('cuda')
    return pipe([prompt] * num_images, num_inference_steps=steps, guidance_scale=guidance_scale).images

if __name__ == "__main__":
    # Provision (or reuse) a single-V100 cluster on GCP
    gpu = rh.cluster(name='rh-v100', instance_type='V100:1', provider='gcp')
    # Send the function to the cluster, along with its requirements
    generate_gpu = rh.send(fn=sd_generate).to(gpu, reqs=['./', 'torch==1.12.0', 'diffusers'])

    # Call it exactly as if it were running locally
    images = generate_gpu('A digital illustration of a woman running on the roof of a house.', num_images=2, steps=50)
    [image.show() for image in images]

There's no magic YAML, DSL, code serialization, or "submitting for execution." We're just spinning up the cluster for you (or using an existing cluster), syncing over your code, starting a gRPC connection, and running your code on the cluster.

Runhouse does things for you that you'd spend time doing yourself, in as obvious a way as possible.

And because the cluster isn't stateless, we can pin the model to GPU memory and get ~1.5s/image inference before any compilation.
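
Pinning works because your function runs in a long-lived process on the cluster, so any state you stash in that process survives between calls. Here's a minimal sketch of the idea using a plain module-level cache (our own illustration, not Runhouse's pinning API; sd_generate_pinned and _PIPELINE_CACHE are hypothetical names):

import torch
from diffusers import StableDiffusionPipeline

_PIPELINE_CACHE = {}  # lives in the cluster's long-running Python process between calls

def sd_generate_pinned(prompt, num_images=1, steps=100, guidance_scale=7.5, model_id='stabilityai/stable-diffusion-2-base'):
    # Load the model (and move it to the GPU) only on the first call; reuse it afterwards
    if model_id not in _PIPELINE_CACHE:
        _PIPELINE_CACHE[model_id] = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16, revision='fp16').to('cuda')
    pipe = _PIPELINE_CACHE[model_id]
    return pipe([prompt] * num_images, num_inference_steps=steps, guidance_scale=guidance_scale).images

Sent to the cluster exactly like sd_generate above, the first call pays the model load and every subsequent call hits the warm, GPU-resident pipeline.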

On the data side, we can do things like:

# Send a folder up to a cluster (rsync)
rh.folder(url=input_images_dir).to(fs=gpu, url='dreambooth/instance_images')

# Stream a table in from anywhere (S3, GCS, local, etc)
preprocessed_table = rh.table(name="preprocessed-tokenized-dataset")
for batch in preprocessed_table.stream(batch_size=batch_size):
    ...

# Send a model checkpoint up to blob storage
trained_model = rh.blob(data=pickle.dumps(model))
trained_model.to('s3', url='runhouse/my_bucket').save(name='yelp_fine_tuned_bert')
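
Named resources can later be reloaded by name, much like the table above. A rough sketch of fetching the checkpoint back (we're assuming the serialized bytes are exposed via .data, mirroring how the blob was created; the exact accessor may differ in your version):

import pickle
import runhouse as rh

# Reload the saved checkpoint by name and deserialize it locally
trained_model = rh.blob(name='yelp_fine_tuned_bert')
model = pickle.loads(trained_model.data)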

These APIs work from anywhere with a Python interpreter and an internet connection, so notebooks, scripts, pipeline DSLs, etc. are all fair game. We currently support AWS, GCP, Azure, and Lambda Labs credentials through SkyPilot, as well as BYO cluster (just drop in an IP address and SSH key).

🐣 Getting Started

tl;dr:

pip install runhouse
# Or "runhouse[aws]", "runhouse[gcp]", "runhouse[azure]", "runhouse[all]"
sky check
# Optionally, for portability (e.g. Colab):
runhouse login

🔌 Installation

⚠️ On Apple M1 or M2 machines ⚠️, you will need to install grpcio with conda before you install Runhouse, and more specifically before you install Ray. If you already have Ray installed, you can skip this. See here for how to install grpc properly on Apple silicon. You can verify that it worked by running ray.init() in a Python interpreter; if you're having trouble with this, let us know.
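
For example, a quick check in a fresh interpreter (this only confirms that Ray and grpcio start cleanly; it doesn't touch any cloud resources):

import ray

# If grpcio was installed correctly, this starts a local Ray instance without errors
ray.init()
ray.shutdown()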

Runhouse can be installed with:

pip install runhouse

Depending on which cloud providers you plan to use, you can also install the following optional dependencies (to install the right versions of tools like boto, gsutil, etc.):

pip install "runhouse[aws]"
pip install "runhouse[gcp]"
pip install "runhouse[azure]"
# Or
pip install "runhouse[all]"

As this is an alpha, we push feature updates every few weeks as new microversions.

✈️ Verifying your Cloud Setup with SkyPilot

Runhouse supports both BYO cluster, where you interact with existing compute via their IP address and SSH key, and autoscaled clusters, where we spin up and down cloud instances in your own cloud account for you. If you only plan to use BYO clusters, you can disregard the following.
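
For the BYO path, pointing Runhouse at an existing machine looks roughly like the sketch below (the parameter names ips and ssh_creds are our best guess for this alpha and may differ in your version):

import runhouse as rh

# An existing box you can already SSH into; nothing is provisioned or terminated for you
byo_gpu = rh.cluster(name='my-lab-box',
                     ips=['<ip address>'],
                     ssh_creds={'ssh_user': '<username>', 'ssh_private_key': '~/.ssh/id_rsa'})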

Runhouse uses SkyPilot for much of the heavy lifting with launching and terminating cloud instances. We love it and you should throw them a Github star ⭐️.

To verify that your cloud credentials are set up correctly for autoscaling, run

sky check

in your command line. This will confirm which cloud providers are ready to use, and will give detailed instructions if any setup is incomplete. SkyPilot also provides an excellent suite of CLI commands for basic instance management operations; there are a few you'll reach for frequently when using Runhouse with autoscaling, so familiarize yourself with them here.

🔒 Creating a Runhouse Account for Secrets and Portability

Using Runhouse with only the OSS Python package is perfectly fine. However, you can unlock some unique portability features by creating an (always free) account on api.run.house and saving your secrets and/or resource metadata there. For example, you can open a Google Colab, call runhouse login, and all of your secrets and resources will be ready to use there with no additional setup. Think of the OSS-package-only experience as akin to Microsoft Office, while creating an account makes your cloud resources shareable and accessible from anywhere, like Google Docs. You can see examples of this portability in the Runhouse Tutorials.

To create an account, visit api.run.house, or simply call runhouse login from the command line (or rh.login() from Python).

Note: These portability features only ever store light metadata about your resources (e.g. my_folder_name -> [provider, bucket, path]) on our API servers. All the actual data and compute stays inside your own cloud account and never hits our servers. The Secrets service stores your secrets in Hashicorp Vault (an industry standard for secrets management), and our secrets APIs simply call Vault's APIs; we never store secrets on our API servers. We plan to add support for BYO secrets management shortly. Let us know if you need it and which system you use.

👨‍🏫 Tutorials / API Walkthrough / Docs

Can be found here. We're planning to do a docs sprint in late February, but for now, our tutorials have been structured to provide a comprehensive walkthrough of the APIs.

🙋‍♂️ Getting Help

Please join our Discord server here to message us, email us (donny at run.house or josh at run.house), or create an issue.

👷‍♀️ Contributing

We welcome contributions! Please contact us if you're interested.
