
Runhouse: A multiplayer cloud compute and data environment

Project description

🏃‍♀️Runhouse🏠


👵 Welcome Home!

Runhouse gives your code the superpower of traversing remote infrastructure, so you can iterate and debug your ML apps and workflows locally in regular Python (no DSLs, yaml, or prescriptive dev environment) with full-scale compute and data (no sandbox). It's the fastest way to build, run, and deploy production-quality ML apps and workflows on your own infrastructure, and perhaps the only way to take production code and run it as-is locally (still backed by the same powerful infra) to iterate or debug it further.

After you've sent a function or class to remote compute, Runhouse also allows you to persist, reuse, and share it as a service, turning otherwise redundant AI activities into common modular components across your team or company. This cuts costs and improves velocity and reproducibility: think of 10 ML pipelines and researchers calling the same shared preprocessing, training, evaluation, or batch inference service, rather than each allocating their own compute and deploying slightly different code. Or imagine experimenting with a new preprocessing method in a notebook while calling every other stage of your ML workflow as the production services themselves.
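
For instance, a researcher in a notebook can reload the team's production services by name and call them directly while iterating on a single stage locally. A minimal sketch (the service names below are hypothetical, and assume each was previously deployed and shared via Den as shown later in this README):

import runhouse as rh

# Hypothetical shared services, previously deployed and saved to Den by the team
train = rh.function(name="/your_team/train")
evaluate = rh.function(name="/your_team/evaluate")

def experimental_preprocess(raw_rows):
    # The new preprocessing idea being iterated on locally, in a notebook
    return [row.lower().strip() for row in raw_rows]

features = experimental_preprocess(["  A Sample Row  ", "  Another Row  "])
model = train(features)      # runs on the team's shared training service
metrics = evaluate(model)    # runs on the shared evaluation service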

Highlights:

  • 👩‍🔬 Dispatch Python functions, classes, and data to remote infra instantly, and call them eagerly as if they were local. Logs are streamed, iteration is fast.
  • 👷‍♀️ Share Python functions or classes as robust services, including HTTPS, auth, observability, scaling, custom domains, secrets, versioning, and more.
  • 🐍 No DSL, decorators, yaml, CLI incantations, or boilerplate. Just your own regular Python.
  • 🚀 Deploy anywhere you run Python. No special packaging or deployment process. Research and production code are identical.
  • 👩‍🎓 BYO-infra with extensive and growing support - Ray, Kubernetes, AWS, GCP, Azure, local, on-prem, and more. When you want to shift or scale, just send your code to more powerful infra.
  • 👩‍🚀 Extreme reproducibility and portability. A single succinct script can allocate the infra, set up dependencies, and serve your app.
  • 👩‍🍳 Nest applications to create complex workflows and services. Components are decoupled so you can change, shift, or scale any component without affecting the rest of your system.

The Runhouse API is dead simple. Send your modules (functions and classes) into environments on compute infra, like this:

import runhouse as rh
from diffusers import StableDiffusionPipeline

def sd_generate(prompt, **inference_kwargs):
    model = StableDiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-base").to("cuda")
    return model(prompt, **inference_kwargs).images

if __name__ == "__main__":
    gpu = rh.cluster(name="rh-a10x", instance_type="A10G:1", provider="aws")
    sd_env = rh.env(reqs=["torch", "transformers", "diffusers"], name="sd_generate", working_dir="./")

    # Deploy the function and environment (syncing over local code changes and installing dependencies)
    remote_sd_generate = rh.function(sd_generate).to(gpu, env=sd_env)

    # This call is actually an HTTP request to the app running on the remote server
    imgs = remote_sd_generate("A hot dog made out of matcha.")
    imgs[0].show()

    # You can also call it over HTTP directly, e.g. from other machines or languages
    print(remote_sd_generate.endpoint())
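
For illustration, the same endpoint could be hit from another machine or language over plain HTTP. The sketch below is hypothetical: the URL, route, and payload shape are assumptions rather than the authoritative Runhouse HTTP API, so check the docs for the exact format:

import requests

# Placeholder for the value printed by remote_sd_generate.endpoint()
endpoint = "http://<cluster-address>:32300/sd_generate"

# Assumed call route and JSON payload shape (verify against the Runhouse HTTP API docs)
resp = requests.post(
    f"{endpoint}/call",
    json={"data": {"args": ["A hot dog made out of matcha."], "kwargs": {}}},
)
resp.raise_for_status()
print(resp.json())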

With the above simple structure you can build, call, and share:

  • 🛠️ AI primitives: Preprocessing, training, fine-tuning, evaluation, inference
  • 🚀 Higher-order services: Multi-step inference, e2e workflows, evaluation gauntlets, HPO
  • 🧪 UAT endpoints: Instant endpoints for client teams to test and integrate
  • 🦺 Best-practice utilities: PII obfuscation, content moderation, data augmentation
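
As a sketch of one such higher-order service, a workflow function can itself be dispatched to a cluster and internally call other remote modules. The service names, cluster, and instance type below are hypothetical and follow the same pattern as the example above; it also assumes both your local machine and the cluster are logged in to Den so the named services can be reloaded:

import runhouse as rh

# Hypothetical remote services, each previously deployed with rh.function(...).to(...)
embed = rh.function(name="/your_team/embed")
rerank = rh.function(name="/your_team/rerank")

def multi_step_inference(query):
    # Each call below is an HTTP request to an independent remote service
    candidates = embed(query)
    return rerank(query, candidates)

if __name__ == "__main__":
    cpu = rh.cluster(name="rh-cpu", instance_type="CPU:2", provider="aws")
    # The workflow itself becomes a callable, shareable service
    remote_workflow = rh.function(multi_step_inference).to(cpu)
    print(remote_workflow("matcha hot dogs"))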

🛋️ Sharing and Versioning with Runhouse Den

You can unlock unique accessibility and sharing features with Runhouse Den, a complementary product to this repo. Log in from anywhere to save, share, and load resources:

runhouse login

or from Python:

import runhouse as rh
rh.login()

Extending the example above to share and load our app via Den:

remote_sd_generate.share(["my_pal@email.com"])

# The service stub can now be reloaded from anywhere, always at your and your collaborators' fingertips
# Notice this code doesn't need to change if you update, move, or scale the service
remote_sd_generate = rh.function("/your_username/sd_generate")
imgs = remote_sd_generate("More matcha hotdogs.")
imgs[0].show()

🏗️ Supported Compute Infra

Please reach out (first name at run.house) if you don't see your favorite compute here. A quick sketch of wrapping an existing machine as a cluster follows the list below.

  • Local - Supported
  • Single box - Supported
  • Ray cluster - Supported
  • Kubernetes - Supported
  • Amazon Web Services (AWS)
    • EC2 - Supported
    • EKS - Supported
    • SageMaker - Supported
    • Lambda - Alpha
  • Google Cloud Platform (GCP)
    • GCE - Supported
    • GKE - Supported
  • Microsoft Azure
    • VMs - Supported
    • AKS - Supported
  • Lambda Labs - Supported
  • Modal Labs - Planned
  • Slurm - Exploratory
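
For example, an existing on-prem box or any machine reachable over SSH can be wrapped as a static cluster by passing its address and credentials. A minimal sketch (the IP, user, and key path are placeholders, and the argument names assume the static-cluster API in this release):

import runhouse as rh

def hello(name):
    return f"Hello from the cluster, {name}!"

# Wrap an existing machine (placeholder address and credentials) as a static cluster
onprem = rh.cluster(
    name="my-onprem-box",
    ips=["192.0.2.10"],
    ssh_creds={"ssh_user": "ubuntu", "ssh_private_key": "~/.ssh/id_rsa"},
)

# The same .to() dispatch shown earlier works on it unchanged
remote_hello = rh.function(hello).to(onprem)
print(remote_hello("world"))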

👨‍🏫 Learn More

🐣 Getting Started: Installation, setup, and a quick walkthrough.
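
Installation is a standard pip install of this package:

pip install runhouse

Cloud-provider extras (e.g. for AWS or GCP) are available as optional dependencies; see the Getting Started guide for the exact extra names.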

📖 Docs: Detailed API references, basic API examples and walkthroughs, end-to-end tutorials, and high-level architecture overview.

👩‍💻 Blog: Deep dives into Runhouse features, use cases, and the future of AI infra.

👾 Discord: Join our community to ask questions, share ideas, and get help.

𝑋 Twitter: Follow us for updates and announcements.

🙋‍♂️ Getting Help

Message us on Discord, email us (first name at run.house), or create an issue.

👷‍♀️ Contributing

We welcome contributions! Please check out contributing.

Download files

Download the file for your platform.

Source Distribution

runhouse-0.0.29.tar.gz (305.8 kB)

Uploaded Source

Built Distribution

runhouse-0.0.29-py3-none-any.whl (368.5 kB)

Uploaded Python 3

File details

Details for the file runhouse-0.0.29.tar.gz.

File metadata

  • Filename: runhouse-0.0.29.tar.gz
  • Size: 305.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for runhouse-0.0.29.tar.gz:

  • SHA256: 31fd3fea6ab538489b2107d2a3fc32299a1285fad8d374c0d2494654f57af306
  • MD5: c2b66c52e594d666a8fd0e961b162c76
  • BLAKE2b-256: 7d9cc2671dc86142827952f6de2987afccb9dca73a0b0381e1af94845003610b


File details

Details for the file runhouse-0.0.29-py3-none-any.whl.

File metadata

  • Filename: runhouse-0.0.29-py3-none-any.whl
  • Size: 368.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for runhouse-0.0.29-py3-none-any.whl:

  • SHA256: 5ff0f52824876e9f73b8fd1047ab401bd47eb79f57c3d862ae70ea502db00e28
  • MD5: 6819459b78e842f937e8bf15bd3894dd
  • BLAKE2b-256: 6cfa0ca72bf1d57e32583b7511996ca38e8628c05a090281a177283ff344f648

