
Python-native infrastructure for the cloud: LaunchFlow provides a Python SDK that automatically creates and connects to production-ready infrastructure (such as Postgres, Redis, and more) in your own cloud account. LaunchFlow removes the need for hands-on DevOps work, letting you focus on your application logic.

Project description

Open Source Deployment Tool for AWS and GCP

📖 Docs   |   ⚡ Quickstart   |   👋 Slack

LaunchFlow is an open source command line tool that deploys APIs, web apps, and other applications to AWS / GCP with minimal configuration. Every deployment option ships with sensible defaults and is fully customizable with Python + Terraform.

  • Serverless Deployments
  • Auto-Scaling VMs
  • Kubernetes Clusters (in preview)
  • Static Sites (in preview)
  • Terraform Resources
  • Pulumi Resources (coming soon)
  • Custom Resources (coming soon)

Use the Python SDK to define your infrastructure in code, then run lf deploy to deploy everything to a dedicated VPC environment in your cloud account.

🧠 Concepts

Services - Docs

Services allow you to deploy APIs, web apps, background workers and other types of applications to your cloud account with minimal setup.

[!NOTE] LaunchFlow is not just for deploying Python apps. The Python SDK is used to define your infrastructure in code, but you can deploy any application that runs on a VM, container, or serverless environment.

Python is just the language for your cloud configuration, similar to how Terraform uses HCL.

Click the dropdown below to see the service types that are currently supported.

Service Types
  • Serverless APIs
    • (AWS) Lambda Service - Docs
    • (GCP) Cloud Run Service - Docs
  • Auto-Scaling VMs
    • (AWS) ECS Fargate Service - Docs
    • (GCP) Compute Engine Service - Docs
  • Kubernetes Clusters
    • (AWS) EKS - coming soon
    • (GCP) GKE - Docs
  • Static Websites
    • (AWS) S3 Static Site - coming soon
    • (GCP) GCS Static Site with Load Balancer - coming soon
    • (GCP) Firebase Static Site - coming soon
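
Each of these service types maps to a class in the Python SDK, so switching clouds is largely a matter of swapping the class. A minimal sketch using class names from the examples later in this README (the service name is a placeholder):

import launchflow as lf

# Pick the target that matches your cloud account:
api = lf.aws.ECSFargateService("my-api")      # ECS Fargate on AWS
# api = lf.gcp.CloudRunService("my-api")      # Cloud Run on GCP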

Resources - Docs

Resources are the cloud services that your application uses, such as databases, storage, queues, and secrets. LaunchFlow provides a simple way to define, manage, and use these resources in your application.

Click the dropdown below to see the resource types that are currently supported.

Resource Types
  • Cloud Storage
    • (AWS) S3 Bucket - Docs
    • (GCP) GCS Bucket - Docs
  • Databases (Postgres, MySQL, etc.)
    • (AWS) RDS - Docs
    • (GCP) Cloud SQL - Docs
  • Redis
    • (AWS) ElastiCache Redis - Docs
    • (GCP) Memorystore Redis - Docs
  • Task Queues
    • (AWS) SQS Queue - Docs
    • (GCP) Pub/Sub - Docs
    • (GCP) Cloud Tasks - Docs
  • Secrets
    • (AWS) Secrets Manager - Docs
    • (GCP) Secret Manager - Docs
  • Custom Domains
    • (AWS) Route 53 - coming soon
    • (GCP) Custom Domain Mapping - Docs
  • Monitoring & Alerts
    • (AWS) CloudWatch - coming soon
    • (GCP) StackDriver - coming soon
  • Custom Terraform Resources - coming soon
  • Custom Pulumi Resources - coming soon
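
As a rough sketch of how resources are declared and used (class and method names are taken from the examples later in this README; the resource names are placeholders):

import launchflow as lf

# Declare the resources your app depends on; permissions are configured on deploy.
bucket = lf.aws.S3Bucket("my-bucket")
postgres = lf.aws.RDSPostgres("my-db", disk_size_gb=10)

# Use them directly from application code.
bucket.upload_from_string("hello", "hello.txt")
postgres.query("SELECT 1")
engine = postgres.sqlalchemy_engine()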

Environments - Docs

Environments group Services and Resources inside a private network (VPC) on either GCP or AWS. You can create multiple environments for different stages of your workflow (e.g. development, staging, production) and switch between them with a single command.
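
The active environment is also exposed in code as lf.environment, which the examples below use to vary configuration per environment. A small sketch (the domain is a placeholder):

import launchflow as lf

# The same definition deploys into whichever environment you target;
# lf.environment resolves to the active environment's name (e.g. "dev").
api = lf.gcp.CloudRunService(
    "my-api",
    domain=f"{lf.environment}.api.example.com",
)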

⚙️ Installation

pip install launchflow

🚀 Quickstart

Deploy FastAPI to ECS Fargate on AWS:

Step 1. Create a new Python file (e.g. main.py) and add the following code:

from fastapi import FastAPI
import launchflow as lf

app = FastAPI()

@app.get("/")
def index():
    return f'Hello from {lf.environment}!'

Step 2. Add a Service type to your Python file:

from fastapi import FastAPI
import launchflow as lf

app = FastAPI()

@app.get("/")
def index():
    return f'Hello from {lf.environment}!'

# Deploy this FastAPI app to ECS Fargate on AWS
api = lf.aws.ECSFargateService("my-api")

Step 3. Run the lf deploy command to deploy your infrastructure:

lf deploy

This command will do the following:

  1. Generate a Dockerfile and launchflow.yaml file (if you don't already have them)
  2. Create a new VPC (Environment) in your AWS account (if one doesn't already exist)
  3. Create a new ECS Fargate service and task definition (if they don't already exist)
  4. Create a new Application Load Balancer and Route 53 DNS record (if they don't already exist)
  5. Build a Docker image and push it to ECR
  6. Deploy your FastAPI app to the new ECS Fargate service
  7. Output the URL & DNS settings of your new FastAPI app

Step 4. Add a Resource type and customize the Service:

from fastapi import FastAPI
import launchflow as lf

# Resource permissions are automatically configured for you
bucket = lf.aws.S3Bucket("my-bucket")

app = FastAPI()

@app.get("/")
def index():
    bucket.upload_from_string(f"Hello from {lf.environment}!", "hello.txt")
    return bucket.download_file("hello.txt").decode()

# You can customize the Fargate service with Python
api = lf.aws.ECSFargateService("my-api", domain="your-domain.com", memory=512, cpu=256)

Step 5. Run the lf deploy command to deploy your updated infrastructure:

lf deploy

📖 Examples

Click the dropdowns below to see each example's code.

Deploy FastAPI to ECS Fargate (AWS)

from fastapi import FastAPI
import launchflow as lf

app = FastAPI()

@app.get("/")
def index():
    return f'Hello from {lf.environment}!'

# Deploy this FastAPI app to ECS Fargate on AWS
api = lf.aws.ECSFargateService("my-api", domain="your-domain.com")

Deploy FastAPI to Cloud Run (GCP)

from fastapi import FastAPI
import launchflow as lf

app = FastAPI()

@app.get("/")
def index():
    return f'Hello from {lf.environment}!'

# Deploy this FastAPI app to Cloud Run on GCP
api = lf.gcp.CloudRunService("my-api", domain="your-domain.com")

Deploy Postgres to RDS & EC2 (AWS)

import launchflow as lf

# Create / Connect to a Postgres Cluster on RDS
postgres = lf.aws.RDSPostgres("postgres-cluster", disk_size_gb=10)

# Or on an EC2 VM
postgres = lf.aws.EC2Postgres("postgres-vm")

if __name__ == "__main__":
    # Built-in utility methods for using Postgres
    postgres.query("SELECT * FROM my_table")

    # Built-in connectors for Python ORMs
    postgres.sqlalchemy_engine()
    postgres.django_settings()

Deploy Postgres to Cloud SQL & Compute Engine (GCP)

import launchflow as lf

# Create / Connect to a Postgres Cluster on Cloud SQL
postgres = lf.gcp.CloudSQLPostgres("postgres-cluster", disk_size_gb=10)

# Or on a Compute Engine VM
postgres = lf.gcp.ComputeEnginePostgres("postgres-vm")

if __name__ == "__main__":
    # Built-in utility methods for using Postgres
    postgres.query("SELECT * FROM my_table")

    # Built-in connectors for Python ORMs
    postgres.sqlalchemy_engine()
    postgres.django_settings()

👀 Coming Soon

Deploy a static React app to a CDN (GCP)

[!IMPORTANT] This example is not yet available in the LaunchFlow Python SDK.

import launchflow as lf

# Deploy a static React app to a GCS Bucket with a CDN
bucket = lf.gcp.BackendBucket(
    "react-app", "./dst", domain=f"{lf.environment}.app.launchflow.com"
)

if __name__ == "__main__":
    # Use Python to easily automate non-Python applications
    print(f"Bucket URL: {bucket.url}")

Full-on scripting with Python (GCP)

[!IMPORTANT] This example is not yet available in the LaunchFlow Python SDK.

import launchflow as lf


backend = lf.gcp.CloudRunService(
    "fastapi-api", domain=f"{lf.environment}.api.launchflow.com"
)

frontend = lf.gcp.BackendBucket(
    "react-static-app",
    static_directory="./dst",
    domain=f"{lf.environment}.console.launchflow.com",
    env={
        "LAUNCHFLOW_API_URL": backend.url
    }
)

result = lf.deploy(backend, frontend, environment="dev")

if not result.successful:
    print(result.error)
    exit(1)

print(f"Frontend URL: {frontend.url}")
print(f"Backend URL: {backend.url}")

Don't see what you're looking for?

Reach out to team@launchflow.com to speed up development of the feature you need. Most of the unfinished features are already in development and can be completed in under a week - we just need to know what to prioritize!


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

launchflow-0.4.13.tar.gz (320.8 kB)

Uploaded Source

Built Distribution

launchflow-0.4.13-py3-none-any.whl (469.1 kB)

Uploaded Python 3

File details

Details for the file launchflow-0.4.13.tar.gz.

File metadata

  • Download URL: launchflow-0.4.13.tar.gz
  • Upload date:
  • Size: 320.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.6

File hashes

Hashes for launchflow-0.4.13.tar.gz

  • SHA256: 62d5576cfefc875655ba2f241de6390cacb0f44d52ac569dbadfef3a66b468cf
  • MD5: 216b44bb5625d317b42359cbb0f53ad4
  • BLAKE2b-256: b162668ee9565ece84cf9f8b73f830dfac0cf997fd81d8e15e2cea990243aa9d

See more details on using hashes here.
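
To check a download yourself, here is a small sketch using only the Python standard library (it assumes the sdist sits in the current directory):

import hashlib

# Compare the downloaded sdist against the SHA256 digest published above.
expected = "62d5576cfefc875655ba2f241de6390cacb0f44d52ac569dbadfef3a66b468cf"

with open("launchflow-0.4.13.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else f"Hash mismatch: {digest}")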

File details

Details for the file launchflow-0.4.13-py3-none-any.whl.

File metadata

  • Download URL: launchflow-0.4.13-py3-none-any.whl
  • Upload date:
  • Size: 469.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.6

File hashes

Hashes for launchflow-0.4.13-py3-none-any.whl

  • SHA256: 051c82818fc69c474dbfb36374afb9ecd3448baffc6c4a512efaa69f5c2f1f12
  • MD5: 0efe59b8d48faabbc16d6152e00f7a74
  • BLAKE2b-256: 6cd8dd5929a84df5abe226a60cf8a68c69299047566aff87e51feb0d04878c02

See more details on using hashes here.
