LLM-Launchpad
Personal LLM deployments made simple.
Spin up LLM endpoints on Modal for local and personal use.
- Deploy any open-source model from the Hugging Face model hub
- OpenAI-compatible endpoints via llama.cpp (preferred) and vLLM backends
- Direct integration with OpenCode
Prerequisites
- uv for Python, environment, and CLI tool management (install with `curl -LsSf https://astral.sh/uv/install.sh | sh`)
- Modal account
- Hugging Face account
- Optional: OpenCode (install with `curl -fsSL https://opencode.ai/install | bash`)
Quickstart
Get up and running in four steps:

1. Install the CLI so `llm-launchpad` is available in your shell: `uv tool install llm-launchpad`, then verify with `llm-launchpad --help`
2. Authenticate Modal: `modal setup`
3. Authenticate Hugging Face: `huggingface-cli login`
4. Launch the TUI: `llm-launchpad`
Why a TUI?
Setting up LLM endpoints usually means juggling model names, container images, GPU choices, warmup checks, logs, and endpoint details across several commands. The TUI keeps that flow in one place.
From the TUI you can:
- Launch any open-source model on the Hugging Face model hub without memorizing Modal or backend-specific commands
- Manage multiple deployed instances and inspect their status
- Feed the resulting OpenAI-compatible base URL and model ID into tools such as OpenCode after deployment
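Once a deployment is live, the endpoint speaks the standard OpenAI chat-completions protocol, so any OpenAI-compatible client can talk to it. A minimal sketch using only the Python standard library; the base URL and model ID below are placeholder values you would read off from the TUI:

```python
import json
from urllib.request import Request

def build_chat_request(base_url: str, model: str, prompt: str) -> Request:
    """Build a POST request for an OpenAI-compatible chat-completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        url=f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Placeholder values; substitute the base URL and model ID reported by the TUI.
req = build_chat_request(
    "https://example--llm-launchpad.modal.run", "my-model", "Hello!"
)
```

Send the request with `urllib.request.urlopen(req)`, or point the official `openai` client's `base_url` at the same endpoint instead.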
OpenCode integration
LLM-Launchpad automatically detects a local installation of OpenCode and will set up your OpenCode config with the final OpenAI-compatible base URL and model ID after deployment.
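For illustration, the kind of provider entry this amounts to might look like the sketch below. The file path, provider name, and field names here are hypothetical, not OpenCode's exact schema; in practice the TUI writes the real config for you after deployment:

```python
import json
from pathlib import Path

# Hypothetical values: the TUI fills in the real base URL and model ID.
provider_entry = {
    "baseURL": "https://example--llm-launchpad.modal.run/v1",
    "models": {"my-model": {}},
}

# Hypothetical location and layout: OpenCode's actual config may differ.
config = {"provider": {"llm-launchpad": provider_entry}}
config_path = Path("opencode.json")
config_path.write_text(json.dumps(config, indent=2))
```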
Development setup
If you are working from a clone and want the command available directly while editing the source:
git clone https://github.com/ThomasRochefortB/llm-launchpad.git
cd llm-launchpad
uv tool install --editable .
llm-launchpad --help
If you need the full project environment for tests or local development workflows:
uv sync
uv run pytest
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file llm_launchpad-1.0.1.tar.gz.
File metadata
- Download URL: llm_launchpad-1.0.1.tar.gz
- Upload date:
- Size: 192.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.7.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `030dad783db93868c02d16b375dfdf461674bfcfa347dd8d36f5119863d57617` |
| MD5 | `3635129a67b11e3346b8da96520b1a02` |
| BLAKE2b-256 | `452890227944e55c71057aa8cc8effa519b0a4272e5acce08fb71ee2a2164459` |
File details
Details for the file llm_launchpad-1.0.1-py3-none-any.whl.
File metadata
- Download URL: llm_launchpad-1.0.1-py3-none-any.whl
- Upload date:
- Size: 144.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.7.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `8948dab051b6482d6c937274abb2d6449e2ddb202544abaf882997327bedb7b8` |
| MD5 | `aeaab30835d94542710a33570fec2000` |
| BLAKE2b-256 | `c0b10c0b81b90531832e01c56b5983c725cdec71b9459de62d96a72b50832527` |
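To check a downloaded distribution against the digests listed above, compute its SHA256 locally and compare. A small sketch using the standard library; the demo below hashes a throwaway file, but for a real check you would point it at `llm_launchpad-1.0.1.tar.gz` or the wheel:

```python
import hashlib
from pathlib import Path

def sha256_of(path: str) -> str:
    """Stream a file through SHA256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a throwaway file; compare the result for a real download
# against the SHA256 digest listed in the tables above.
Path("demo.bin").write_bytes(b"hello")
digest = sha256_of("demo.bin")
```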