OpenCode llama.cpp Launcher
One-command launcher for running OpenCode with a local llama.cpp model.
Launch OpenCode with a local model served by llama.cpp. The launcher starts llama-server, wires OpenCode to it, and cleans up when your session ends.
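The lifecycle described above (start a background server, run the session, always clean up) can be sketched as follows. This is an illustrative sketch only: `run_with_server` and the stand-in sleep command are hypothetical, not the launcher's actual API.

```python
import subprocess
import sys


def run_with_server(server_cmd, session):
    """Start a background server process, run the session, then clean up.

    Mirrors the launcher's lifecycle: start llama-server, run OpenCode
    against it, and terminate the server when the session ends. The real
    launcher also waits for the server to become healthy before handing off.
    """
    proc = subprocess.Popen(server_cmd)
    try:
        return session()
    finally:
        proc.terminate()
        try:
            proc.wait(timeout=10)
        except subprocess.TimeoutExpired:
            proc.kill()
            proc.wait()


# Stand-in for llama-server: a process that would outlive the session
# if the launcher did not clean it up.
result = run_with_server(
    [sys.executable, "-c", "import time; time.sleep(60)"],
    lambda: "session done",
)
print(result)  # session done
```

The try/finally shape is what makes cleanup unconditional: the server is terminated whether the session exits normally or raises.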
Requirements
The launcher finds llama-server on PATH, or you can set llama_server in
your config.
Install OpenCode using its GitHub installation instructions. Install llama.cpp using its installation guide.
Install
For most users, install with pipx:
pipx install opencode-llama-cpp-launcher
Or install with pip:
python -m pip install opencode-llama-cpp-launcher
Check that the required external binaries are available:
opencode-llama doctor
Configure
Create opencode-llama.yaml in the project where you want OpenCode to run, or
create ~/.config/opencode-llama.yaml for a user-wide default:
model: /absolute/path/to/model.gguf
ctx_size: 8192
# Optional
port: 8080
llama_server: /optional/path/to/llama-server
Config lookup order:
- The path passed with --config
- opencode-llama.yaml or opencode-llama.yml in the project directory
- ~/.config/opencode-llama.yaml
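The lookup order above amounts to checking a short list of candidate paths and taking the first that exists. A minimal sketch, assuming only the documented order (the function name and return convention are illustrative, not the launcher's internals):

```python
from pathlib import Path


def resolve_config(explicit=None):
    """Return the first config file found, following the documented order:
    the --config path, then project-local YAML, then the user-wide default.
    Returns None when no candidate exists.
    """
    candidates = []
    if explicit:
        candidates.append(Path(explicit))
    candidates += [
        Path("opencode-llama.yaml"),
        Path("opencode-llama.yml"),
        Path.home() / ".config" / "opencode-llama.yaml",
    ]
    return next((p for p in candidates if p.is_file()), None)
```

Note that a .yaml file in the project directory shadows a .yml one, and any explicit --config path wins over both.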
Usage
Run with an explicit config file:
opencode-llama --config opencode-llama.yaml
Or pass the model directly:
opencode-llama --model /absolute/path/to/model.gguf
Useful options:
opencode-llama --help
opencode-llama --dry-run
opencode-llama --config opencode-llama.yaml
opencode-llama --port 9001
opencode-llama --ctx-size 8192
opencode-llama --llama-server /absolute/path/to/llama-server
If llama-server fails before becoming healthy, the launcher includes a bounded
tail of the server's startup output in the error message. Successful runs stay
quiet.
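Keeping only a bounded tail of output is commonly done with a fixed-size deque, so the launcher never buffers unbounded server logs. A sketch of the technique (the function name and the 40-line limit are assumptions, not the launcher's actual values):

```python
from collections import deque


def bounded_tail(lines, limit=40):
    """Keep only the last `limit` lines of a stream of output.

    A deque with maxlen discards the oldest line as each new one arrives,
    so memory use stays constant no matter how much the server prints.
    """
    tail = deque(maxlen=limit)
    for line in lines:
        tail.append(line)
    return "\n".join(tail)


# Even with 1000 lines of startup output, only the tail survives.
msg = bounded_tail((f"log line {i}" for i in range(1000)), limit=3)
print(msg)  # the final three lines only
```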
Development
Install dependencies from this repository:
uv sync --dev
Run the test suite:
uv run pytest
Before publishing, check for local files:
git status --short --ignored
Do not commit local launcher configs, virtual environments, caches, build artifacts, or model paths.
License
MIT