CLI tool to expose LLMs through messaging clients (Telegram, Discord, Slack, etc.)
Expose LLM-powered assistants through messaging platforms such as Telegram and Discord
🤖 ← 🐈 ← 🌐 ← 📱 ← 🧙♂️
llm-expose gives you a channel-first CLI workflow: configure providers, attach channels, control pairings, and optionally integrate MCP servers for tool-aware completions.
Features
- Multi-channel support (Telegram and Discord).
- LiteLLM provider support for broad model compatibility.
- Local OpenAI-compatible endpoint support.
- MCP server integration for tool-aware responses.
- Pairing-based access control per channel.
- CLI-first setup and operations.
Installation
Quick Install (One-Liner)
Linux & macOS:
curl -fsSL https://raw.githubusercontent.com/edo0xff/llm-expose/main/scripts/install.sh | bash
Windows (PowerShell as Administrator):
powershell -ExecutionPolicy Bypass -Command "iex (New-Object Net.WebClient).DownloadString('https://raw.githubusercontent.com/edo0xff/llm-expose/main/scripts/install-windows.ps1')"
From PyPI
pip install llm-expose
From source
git clone https://github.com/edo0xff/llm-expose.git
cd llm-expose
pip install -e .
Development install
pip install -e '.[dev]'
See scripts/README.md for detailed installation instructions and troubleshooting.
Quick Start
llm-expose is interactive by default, which is usually the fastest path for humans.
Use --no-input for headless automation, and add -y to auto-confirm any command that would otherwise prompt for confirmation.
- Configure a model:
llm-expose add model
- Configure a channel (interactive):
llm-expose add channel
- Pair an allowed user/chat ID:
llm-expose add pair 123456789 --channel my-telegram
- Start the channel runtime:
llm-expose start
Headless equivalent (CI/scripts):
llm-expose add model --name gpt4o-mini --provider openai --model-id gpt-4o-mini -y --no-input
llm-expose add channel --name my-telegram --client-type telegram --bot-token "123456789:AAExampleTelegramToken" --model-name gpt4o-mini -y --no-input
llm-expose add pair 123456789 --channel my-telegram --no-input
llm-expose start --channel my-telegram -y --no-input
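For CI pipelines, the headless sequence above can be assembled programmatically before execution. This is a sketch, not part of llm-expose itself; the flags mirror the commands shown above, and the function only builds the argument lists so you can inspect or log them before running anything.

```python
def headless_setup_cmds(model: str, channel: str, bot_token: str, chat_id: str):
    """Build the Quick Start headless commands as argv lists.

    Execute them afterwards with, e.g.:
        for cmd in headless_setup_cmds(...):
            subprocess.run(cmd, check=True)
    """
    return [
        ["llm-expose", "add", "model", "--name", model, "--provider", "openai",
         "--model-id", "gpt-4o-mini", "-y", "--no-input"],
        ["llm-expose", "add", "channel", "--name", channel, "--client-type",
         "telegram", "--bot-token", bot_token, "--model-name", model,
         "-y", "--no-input"],
        ["llm-expose", "add", "pair", chat_id, "--channel", channel, "--no-input"],
        ["llm-expose", "start", "--channel", channel, "-y", "--no-input"],
    ]
```

Keeping the commands as data makes it easy to dry-run the setup in CI (print the lists) before letting the pipeline execute them.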
If you are unsure about available options, run:
llm-expose --help
llm-expose add --help
llm-expose start --help
Pairing Model
Incoming chat/channel IDs must be explicitly paired before the service replies.
When an unpaired ID sends a message, the service returns:
This instance is not paired. Run llm-expose add pair <channel-id>
Pairings are stored per channel configuration.
Common pairing commands:
llm-expose add pair <id> --channel <channel-name>
llm-expose list pairs
llm-expose list pairs --channel <channel-name>
llm-expose delete pair <id> --channel <channel-name>
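Conceptually, the gate is an allow-list lookup keyed by channel. The sketch below illustrates the pairing model only; the names are invented here and are not llm-expose's actual internals.

```python
# Illustrative sketch of the pairing gate; not llm-expose internals.
PAIRINGS: dict[str, set[str]] = {"my-telegram": {"123456789"}}

def is_paired(channel: str, chat_id: str) -> bool:
    """True if chat_id has been explicitly paired on this channel."""
    return chat_id in PAIRINGS.get(channel, set())

def gate_message(channel: str, chat_id: str):
    """Return the rejection text for unpaired senders, or None if allowed."""
    if is_paired(channel, chat_id):
        return None  # allowed: forward the message to the model
    return f"This instance is not paired. Run llm-expose add pair {chat_id}"
```

Because the lookup falls back to an empty set, a channel with no pairings rejects everyone by default, which matches the behavior described above.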
Configuration Workflow
llm-expose currently uses CLI commands to persist configuration (models, channels, and MCP settings).
Recommended setup order:
- Add one or more models (llm-expose add model ...).
- Add one or more channels (llm-expose add channel ...).
- Add optional MCP servers (llm-expose add mcp ...).
- Pair allowed IDs (llm-expose add pair ...).
- Run the exposure service (llm-expose start ...).
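The order matters because each object references the one before it: a channel points at a model, and pairings hang off a channel. A hypothetical data model (invented here for illustration; not llm-expose's actual storage format) makes those dependencies explicit:

```python
from dataclasses import dataclass, field

# Hypothetical shape of the persisted configuration; field names are
# illustrative and do not reflect llm-expose's real on-disk format.
@dataclass
class ChannelConfig:
    name: str
    client_type: str                  # e.g. "telegram" or "discord"
    model_name: str                   # must reference an existing model
    paired_ids: set = field(default_factory=set)

@dataclass
class AppConfig:
    models: dict = field(default_factory=dict)    # model name -> model id
    channels: dict = field(default_factory=dict)  # channel name -> ChannelConfig

    def add_channel(self, name: str, client_type: str, model_name: str):
        if model_name not in self.models:
            raise ValueError(f"unknown model: {model_name}")  # add models first
        self.channels[name] = ChannelConfig(name, client_type, model_name)

    def add_pair(self, channel: str, chat_id: str):
        if channel not in self.channels:
            raise ValueError(f"unknown channel: {channel}")   # add channels first
        self.channels[channel].paired_ids.add(chat_id)
```

Following the recommended order keeps every reference valid at the moment it is created, which is why models come first and pairings come last.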
Development
Run quality checks:
ruff check .
black --check .
mypy llm_expose
pytest
Roadmap
- PyPI release automation.
- Hosted docs site with architecture and API references.
- More channel adapters and provider presets.
Contributing
See CONTRIBUTING.md.
License
MIT. See LICENSE.