Personal LLM deployments made simple.

LLM-Launchpad

Spin up LLM endpoints on Modal for local and personal use

  • Deploy any open-source model from the Hugging Face model hub.
  • OpenAI-compatible endpoints via llama.cpp (preferred) and vLLM backends.
  • Direct integration with OpenCode.

Prerequisites

  • uv for Python, environment, and CLI tool management (install with curl -LsSf https://astral.sh/uv/install.sh | sh)
  • Modal account
  • Hugging Face account
  • Optional: OpenCode (install with curl -fsSL https://opencode.ai/install | bash)

Quickstart

Get up and running in four steps:

  1. Install the CLI so llm-launchpad is available in your shell:

    uv tool install llm-launchpad
    llm-launchpad --help
    
  2. Authenticate Modal:

    modal setup
    
  3. Authenticate Hugging Face:

    huggingface-cli login
    
  4. Launch the TUI:

    llm-launchpad
    

Why a TUI?

Setting up LLM endpoints usually means juggling model names, container images, GPU choices, warmup checks, logs, and endpoint details across several commands. The TUI keeps that flow in one place.

From the TUI you can:

  • Launch any open-source model on the Hugging Face model hub without memorizing Modal or backend-specific commands
  • Manage multiple deployed instances and inspect their status
  • Copy the resulting OpenAI-compatible base URL and model ID into tools such as OpenCode after deployment
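A deployed endpoint speaks the standard OpenAI chat-completions protocol. As a minimal sketch of the request shape (the base URL and model ID below are placeholders; the TUI shows the real values after deployment):

```python
import json

def chat_request(base_url: str, model_id: str, prompt: str) -> tuple[str, str]:
    """Build the URL and JSON body for an OpenAI-compatible
    /chat/completions call against a deployed endpoint."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model_id,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# Placeholder values -- substitute the base URL and model ID printed by the TUI.
url, body = chat_request(
    "https://example--llm-launchpad.modal.run/v1",
    "my-deployed-model",
    "Hello!",
)
```

Any OpenAI-compatible client (for example the official `openai` Python package pointed at the same base URL) works equally well; the sketch above only shows the raw request shape.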

OpenCode integration

LLM-Launchpad automatically detects a local installation of OpenCode and, after deployment, updates your OpenCode config with the final OpenAI-compatible base URL and model ID.
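The exact entry written depends on your OpenCode version; the snippet below is only an illustrative shape (the provider name, npm package, base URL, and model ID are all assumptions — consult the OpenCode documentation for the authoritative schema):

```json
{
  "provider": {
    "llm-launchpad": {
      "npm": "@ai-sdk/openai-compatible",
      "options": { "baseURL": "https://example--llm-launchpad.modal.run/v1" },
      "models": { "my-deployed-model": {} }
    }
  }
}
```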

Development setup

If you are working from a clone and want the command available directly while editing the source:

git clone https://github.com/ThomasRochefortB/llm-launchpad.git
cd llm-launchpad
uv tool install --editable .
llm-launchpad --help

If you need the full project environment for tests or local development workflows:

uv sync
uv run pytest

Download files

Download the file for your platform.

Source Distribution

llm_launchpad-1.0.1.tar.gz (192.2 kB)

Built Distribution

llm_launchpad-1.0.1-py3-none-any.whl (144.1 kB)

File details

Details for the file llm_launchpad-1.0.1.tar.gz.

File metadata

  • File name: llm_launchpad-1.0.1.tar.gz
  • Size: 192.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.7.4

File hashes

Hashes for llm_launchpad-1.0.1.tar.gz:

  • SHA256: 030dad783db93868c02d16b375dfdf461674bfcfa347dd8d36f5119863d57617
  • MD5: 3635129a67b11e3346b8da96520b1a02
  • BLAKE2b-256: 452890227944e55c71057aa8cc8effa519b0a4272e5acce08fb71ee2a2164459
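To verify a downloaded file against the SHA256 digest above, a small Python sketch (the file path is a placeholder for wherever you saved the sdist):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest for the sdist:
expected = "030dad783db93868c02d16b375dfdf461674bfcfa347dd8d36f5119863d57617"
# sha256_of("llm_launchpad-1.0.1.tar.gz") == expected
```

Equivalently, `shasum -a 256 llm_launchpad-1.0.1.tar.gz` or `pip hash llm_launchpad-1.0.1.tar.gz` prints the same digest.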

File details

Details for the file llm_launchpad-1.0.1-py3-none-any.whl.

File hashes

Hashes for llm_launchpad-1.0.1-py3-none-any.whl:

  • SHA256: 8948dab051b6482d6c937274abb2d6449e2ddb202544abaf882997327bedb7b8
  • MD5: aeaab30835d94542710a33570fec2000
  • BLAKE2b-256: c0b10c0b81b90531832e01c56b5983c725cdec71b9459de62d96a72b50832527
