
Runhouse: A multiplayer cloud compute and data environment

Project description

🏃‍♀️Runhouse🏠


👵 Welcome Home!

Runhouse is the fastest way to build, run, and deploy production-quality AI apps and workflows on your own compute. Leverage simple, powerful APIs for the full lifecycle of AI development, through research→evaluation→production→updates→scaling→management, and across any infra.

By automatically packaging your apps into scalable, secure, and observable services, Runhouse can also turn otherwise redundant AI activities into common reusable components across your team or company, which improves cost, velocity, and reproducibility.

Highlights:

  • 👩‍🔬 Dispatch Python functions, classes, and data to remote infra (clusters, cloud VMs, etc.) instantly. No need to reach for a workflow orchestrator to run different chunks of code on various beefy boxes.
  • 👷‍♀️ Deploy Python functions or classes as production-quality services instantly, including HTTPS, auth, observability, scaling, custom domains, secrets, versioning, and more. No research-to-production gap.
  • 🐍 No DSL, decorators, YAML, CLI incantations, or boilerplate. Just your own Python.
  • 👩‍🎓 Extensive support for Ray, Kubernetes, AWS, GCP, Azure, local, on-prem, and more. When you want to shift or scale, just send your app to more powerful infra.
  • 👩‍🚀 Extreme reusability and portability. A single succinct script can stand up your app, dependencies, and infra.
  • 👩‍🍳 Arbitrarily nest applications to create complex workflows and services. Apps are decoupled so you can change, move, or scale any component without affecting the rest of your system.

The Runhouse API is dead simple. Send your apps (functions and classes) into environments on compute infra, like this:

import runhouse as rh
from diffusers import StableDiffusionPipeline

def sd_generate(prompt, **inference_kwargs):
    model = StableDiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-base").to("cuda")
    return model(prompt, **inference_kwargs).images

if __name__ == "__main__":
    gpu = rh.cluster(name="rh-a10x", instance_type="A10G:1", provider="aws")
    sd_env = rh.env(reqs=["torch", "transformers", "diffusers"], name="sd_generate", working_dir="./")

    # Deploy the function and environment (syncing over local code changes and installing dependencies)
    remote_sd_generate = rh.function(sd_generate).to(gpu, env=sd_env)

    # This call is actually an HTTP request to the app running on the remote server
    imgs = remote_sd_generate("A hot dog made out of matcha.")
    imgs[0].show()

    # You can also call it over HTTP directly, e.g. from other machines or languages
    print(remote_sd_generate.endpoint())
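
Because sd_generate forwards **inference_kwargs straight to the pipeline, generation parameters can be tuned from the client just as they would be locally. A quick follow-up sketch, reusing remote_sd_generate from above and assuming standard diffusers arguments like num_inference_steps and guidance_scale (values here are illustrative):

# Keyword arguments are forwarded to the remote StableDiffusionPipeline call,
# so the usual diffusers parameters work.
imgs = remote_sd_generate(
    "A hot dog made out of matcha.",
    num_inference_steps=25,
    guidance_scale=7.5,
)
imgs[0].show()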

With the above simple structure you can run, deploy, and share:

  • 🛠️ AI primitives: Preprocessing, training, fine-tuning, evaluation, inference
  • 🚀 Higher-order services: Multi-stage inference (e.g. RAG), e2e workflows
  • 🦺 Controls and safety: PII obfuscation, content moderation, drift detection
  • 📊 Data services: ETL, caching, data augmentation, data validation
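
Stateful services like several of the ones above (multi-stage inference, caching) are usually built from classes rather than functions, so a model or cache is loaded once and reused across calls. Below is a minimal sketch of a class-based version of the earlier example, assuming rh.module wraps a class the way rh.function wraps a function; the class and names are illustrative:

import runhouse as rh

class SDWorker:
    """Loads the pipeline once on the cluster and serves repeated generate calls."""

    def __init__(self, model_id="stabilityai/stable-diffusion-2-base"):
        from diffusers import StableDiffusionPipeline
        self.model = StableDiffusionPipeline.from_pretrained(model_id).to("cuda")

    def generate(self, prompt, **inference_kwargs):
        return self.model(prompt, **inference_kwargs).images

if __name__ == "__main__":
    gpu = rh.cluster(name="rh-a10x", instance_type="A10G:1", provider="aws")
    sd_env = rh.env(reqs=["torch", "transformers", "diffusers"], name="sd_worker", working_dir="./")

    # Send the class to the cluster; constructing the remote stub creates a
    # persistent instance there, so the model loads once rather than per call.
    RemoteSDWorker = rh.module(SDWorker).to(gpu, env=sd_env)
    worker = RemoteSDWorker()
    imgs = worker.generate("A hot dog made out of matcha.")
    imgs[0].show()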

🛋️ Share Apps and Resources with Runhouse Den

You can unlock unique portability and sharing features by creating a Runhouse Den account. Log in from anywhere to save, share, and load resources:

runhouse login

or from Python:

import runhouse as rh
rh.login()

Extending the example above to share and load our app via Den:

remote_sd_generate.share(["my_pal@email.com"])

# The service stub can now be reloaded from anywhere, always at your and your collaborators' fingertips
# Notice this code doesn't need to change if you update, move, or scale the service
remote_sd_generate = rh.function(name="/your_username/sd_generate")
imgs = remote_sd_generate("More matcha hotdogs.")
imgs[0].show()

🏗️ Supported Compute Infra

Please reach out (first name at run.house) if you don't see your favorite compute here.

  • Local - Supported
  • Single box - Supported
  • Ray cluster - Supported
  • Kubernetes (K8S) - Supported
  • Amazon Web Services (AWS)
    • EC2 - Supported
    • EKS - Supported
    • SageMaker - Supported
    • Lambda - Alpha
  • Google Cloud Platform (GCP)
    • GCE - Supported
    • GKE - Supported
  • Microsoft Azure
    • VMs - Supported
    • AKS - Supported
  • Lambda Labs - Supported
  • Modal Labs - Planned
  • Slurm - Exploratory
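
For compute that Runhouse doesn't launch for you (an on-prem box, a VM that's already running, a Ray head node you manage), the usual pattern is to point a static cluster at its IP and SSH credentials instead of a cloud provider. A minimal sketch with placeholder values; the IP, user, and key path are illustrative:

import runhouse as rh

# A bring-your-own cluster: Runhouse connects over SSH rather than provisioning
# a new VM. Replace the IP and credentials with your own machine's details.
byo = rh.cluster(
    name="my-onprem-box",
    ips=["1.2.3.4"],
    ssh_creds={"ssh_user": "ubuntu", "ssh_private_key": "~/.ssh/id_rsa"},
)

# Dispatch works the same way as with the on-demand GPU cluster above
# (reusing sd_generate and sd_env from the earlier example).
remote_fn = rh.function(sd_generate).to(byo, env=sd_env)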

👨‍🏫 Learn More

🐣 Getting Started: Installation, setup, and a quick walkthrough.

📖 Docs: Detailed API references, basic API examples and walkthroughs, end-to-end tutorials, and high-level architecture overview.

🎪 Funhouse: Standalone ML apps and examples to try with Runhouse, like image generation models, LLMs, launching Gradio spaces, and more!

👩‍💻 Blog: Deep dives into Runhouse features, use cases, and the future of AI infra.

👾 Discord: Join our community to ask questions, share ideas, and get help.

𝑋 Twitter: Follow us for updates and announcements.

🙋‍♂️ Getting Help

Message us on Discord, email us (first name at run.house), or create an issue.

👷‍♀️ Contributing

We welcome contributions! Please check out our contributing guide if you're interested.

Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distribution

runhouse-0.0.24.tar.gz (289.0 kB)

Built Distribution

runhouse-0.0.24-py3-none-any.whl (351.3 kB)

File details

Details for the file runhouse-0.0.24.tar.gz.

File metadata

  • Download URL: runhouse-0.0.24.tar.gz
  • Upload date:
  • Size: 289.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for runhouse-0.0.24.tar.gz

  • SHA256: 1b69a2496ee89c33155cb08e6fd3c6aa2ae082602eb74bfcd55586f60ccff768
  • MD5: 2cb6f413f792be088c117077647ffd02
  • BLAKE2b-256: 957a1965f543c0c04016d4f091ce535a1d9b7aad084f1c64892985cf858703be

See the pip documentation for more details on using hashes.
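
If you want to check a download against the digests above before installing, one straightforward way is to compute the SHA256 locally and compare it to the published value. A small sketch, assuming the sdist has been downloaded to the current directory:

import hashlib

# Expected digest copied from the SHA256 row above.
expected = "1b69a2496ee89c33155cb08e6fd3c6aa2ae082602eb74bfcd55586f60ccff768"

with open("runhouse-0.0.24.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected else "MISMATCH")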

File details

Details for the file runhouse-0.0.24-py3-none-any.whl.

File metadata

  • Download URL: runhouse-0.0.24-py3-none-any.whl
  • Upload date:
  • Size: 351.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for runhouse-0.0.24-py3-none-any.whl

  • SHA256: ef00732f94498f1c1edf320cbc7d9355b9e782bab2dd88471505c359c9d8b92f
  • MD5: 1e8ae9b321ac61eaa6be385b6f69e0bf
  • BLAKE2b-256: f8585fc9ed65e87f99e1dac19b9359b4f2bcad96120d009218d31494cef5ea91

See the pip documentation for more details on using hashes.
