
CLI tool to expose LLMs through messaging clients (Telegram, Discord, Slack, etc.)

Project description

[Logo and badges: Python 3.11+ · Language: Python · License: MIT · Tests · Docs]

Expose LLM-powered assistants through messaging platforms such as Telegram and Discord

🤖 ← 🐈 ← 🌐 ← 📱 ← 🧙‍♂️


llm-expose gives you a channel-first CLI workflow: configure providers, attach channels, control pairings, and optionally integrate MCP servers for tool-aware completions.

Features

  • Multi-channel support (Telegram and Discord).
  • LiteLLM provider support for broad model compatibility.
  • Local OpenAI-compatible endpoint support.
  • MCP server integration for tool-aware responses.
  • Pairing-based access control per channel.
  • CLI-first setup and operations.

Installation

Quick Install (One-Liner)

Linux & macOS:

curl -fsSL https://raw.githubusercontent.com/edo0xff/llm-expose/main/scripts/install.sh | bash

Windows (PowerShell as Administrator):

powershell -ExecutionPolicy Bypass -Command "iex (New-Object Net.WebClient).DownloadString('https://raw.githubusercontent.com/edo0xff/llm-expose/main/scripts/install-windows.ps1')"

From PyPI

pip install llm-expose

From source

git clone https://github.com/edo0xff/llm-expose.git
cd llm-expose
pip install -e .

Development install

pip install -e '.[dev]'

See scripts/README.md for detailed installation instructions and troubleshooting.

Quick Start

llm-expose is interactive by default, which is usually the fastest path for humans. Use --no-input for headless automation, and add -y to auto-confirm commands that would otherwise prompt.

  1. Configure a model:
llm-expose add model
  2. Configure a channel (interactive):
llm-expose add channel
  3. Pair an allowed user/chat ID:
llm-expose add pair 123456789 --channel my-telegram
  4. Start the channel runtime:
llm-expose start

Headless equivalent (CI/scripts):

llm-expose add model --name gpt4o-mini --provider openai --model-id gpt-4o-mini -y --no-input
llm-expose add channel --name my-telegram --client-type telegram --bot-token "123456789:AAExampleTelegramToken" --model-name gpt4o-mini -y --no-input
llm-expose add pair 123456789 --channel my-telegram --no-input
llm-expose start --channel my-telegram -y --no-input

If you are unsure about available options, run:

llm-expose --help
llm-expose add --help
llm-expose start --help
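For CI pipelines, the headless invocations above can be wrapped in a small helper that appends the non-interactive flags consistently. This wrapper is a hypothetical sketch, not part of llm-expose; it only uses the flags documented in the Quick Start:

```python
import subprocess
from typing import Sequence

def headless(*args: str) -> list[str]:
    """Build an llm-expose argv with the documented non-interactive flags."""
    return ["llm-expose", *args, "-y", "--no-input"]

def run(cmd: Sequence[str]) -> None:
    # check=True surfaces a non-zero exit code as an exception in CI.
    subprocess.run(cmd, check=True)

# Build (but do not yet run) the pairing command from the Quick Start:
cmd = headless("add", "pair", "123456789", "--channel", "my-telegram")
```

Calling run(cmd) then executes the same command as the headless example above.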

Pairing Model

Incoming chat/channel IDs must be explicitly paired before the service replies.

When an unpaired ID sends a message, the service returns:

This instance is not paired. Run llm-expose add pair <channel-id>

Pairings are stored per channel configuration.

Common pairing commands:

  • llm-expose add pair <id> --channel <channel-name>
  • llm-expose list pairs
  • llm-expose list pairs --channel <channel-name>
  • llm-expose delete pair <id> --channel <channel-name>
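The gating behaviour described above can be sketched as follows. The data shapes and function names here are assumptions for illustration, not the project's actual internals; only the reply text and the per-channel storage model come from the docs:

```python
# Fixed reply the service returns to unpaired IDs (quoted from above).
PAIR_MESSAGE = "This instance is not paired. Run llm-expose add pair <channel-id>"

# Pairings are stored per channel configuration: channel name -> paired IDs.
pairings: dict[str, set[str]] = {"my-telegram": {"123456789"}}

def complete(text: str) -> str:
    # Stand-in for the LLM completion call.
    return f"echo: {text}"

def handle_message(channel: str, sender_id: str, text: str) -> str:
    # Unpaired IDs get the fixed reply; paired IDs reach the model.
    if sender_id not in pairings.get(channel, set()):
        return PAIR_MESSAGE
    return complete(text)
```

Adding a pairing with `llm-expose add pair` corresponds to inserting the ID into that channel's allow-set.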

Configuration Workflow

llm-expose currently uses CLI commands to persist configuration (models, channels, and MCP settings).

Recommended setup order:

  1. Add one or more models (llm-expose add model ...).
  2. Add one or more channels (llm-expose add channel ...).
  3. Add optional MCP servers (llm-expose add mcp ...).
  4. Pair allowed IDs (llm-expose add pair ...).
  5. Run the exposure service (llm-expose start ...).
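The recommended order can be expressed as data so an automation script can loop over it. The step list mirrors the workflow above; the dry-run wrapper itself is a hypothetical convenience, not part of llm-expose:

```python
import subprocess

# Recommended setup order, as documented above.
SETUP_STEPS = [
    ["llm-expose", "add", "model"],    # 1. models first
    ["llm-expose", "add", "channel"],  # 2. channels reference models
    ["llm-expose", "add", "mcp"],      # 3. optional MCP servers
    ["llm-expose", "add", "pair"],     # 4. allow-list IDs per channel
    ["llm-expose", "start"],           # 5. run the exposure service
]

def run_setup(dry_run: bool = True) -> list[str]:
    """Walk the steps in order; with dry_run=True, only report them."""
    executed = []
    for step in SETUP_STEPS:
        executed.append(" ".join(step))
        if not dry_run:
            subprocess.run(step, check=True)
    return executed
```

In practice each step also needs its per-command flags (see the headless Quick Start example) before it can run unattended.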

Development

Run quality checks:

ruff check .
black --check .
mypy llm_expose
pytest

Roadmap

  • PyPI release automation.
  • Hosted docs site with architecture and API references.
  • More channel adapters and provider presets.

Contributing

See CONTRIBUTING.md.

License

MIT. See LICENSE.



Download files

Download the file for your platform.

Source Distribution

llm_expose-0.1.2.tar.gz (78.7 kB)


Built Distribution


llm_expose-0.1.2-py3-none-any.whl (67.7 kB)


File details

Details for the file llm_expose-0.1.2.tar.gz.

File metadata

  • Download URL: llm_expose-0.1.2.tar.gz
  • Upload date:
  • Size: 78.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_expose-0.1.2.tar.gz:

  • SHA256: 83e6b08aff6a29f6a4a40e24f14c05583fc9ae1768c9a727fb3de6f1b3fdbe47
  • MD5: 756408ddaf9187cf4e126c7be79e6c1f
  • BLAKE2b-256: 20f09d1422769449e9bf010b560af80843942cf5a0ac05626b3003fa92e02a97


Provenance

The following attestation bundles were made for llm_expose-0.1.2.tar.gz:

Publisher: publish.yml on edo0xff/llm-expose

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llm_expose-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: llm_expose-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 67.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_expose-0.1.2-py3-none-any.whl:

  • SHA256: ab34b22a85670aae1c54e2bf5ef95ab5bc6c5214b897c67527eb831276c2904f
  • MD5: 768e2cc3fc23a6096a34a18d56c9138c
  • BLAKE2b-256: c9c8de801036be5b065812e8463e8bc84b0281ba33cb86204a40c31dfdd0d9c0


Provenance

The following attestation bundles were made for llm_expose-0.1.2-py3-none-any.whl:

Publisher: publish.yml on edo0xff/llm-expose

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
