Personal AI Gateway — Python AI assistant with multi-model routing, web UI, and 62 built-in tools. Core: uvicorn + fastapi + websockets.
SalmAlm — Private AI Gateway
Self-hosted, privacy-first AI gateway. Your data never leaves your machine.
Quick Start
pip install salmalm
salmalm start
# Open http://localhost:18800
Features
- 🔒 100% Local — no data sent to third parties
- 🤖 Multi-LLM — Claude, GPT, Gemini in one interface
- 📚 RAG — chat with your own documents
- 🛠️ 62 built-in tools — web search, file ops, code execution
- 🔐 Vault encryption for sensitive data
- 🐳 Docker ready
Docker
docker-compose up -d
Requirements
- Python 3.10+
- API key for at least one LLM provider
😈 SalmAlm
Self-hosted personal AI gateway — one pip install, no Docker, no Node.js.
Features
- Multi-provider LLM routing — OpenAI, Anthropic, Google, xAI, Ollama with 3-tier auto-routing (simple / moderate / complex)
- Automatic failover + circuit breaker — transparent retry across providers; unhealthy endpoints are isolated
- RAG — BM25 + semantic search with Reciprocal Rank Fusion (RRF); indexes your local files automatically
- Vault encryption — AES-256-GCM with PBKDF2-200K key derivation; opt-in per secret
- OAuth2 — Google and Anthropic social login flows
- WebSocket streaming — real-time token streaming to the web UI
- Multi-user auth — JWT-based session management with per-user quotas
- Cost tracking + daily quotas — per-model token accounting with configurable daily spend caps
- Prometheus metrics — /metrics endpoint; drop-in for any Grafana stack
- SQLite audit log — WAL mode; every request, tool call, and auth event is logged
- 62 built-in tools — shell exec, file I/O, web search (Brave), browser automation, TTS/STT, image gen, cron, and more
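The Vault's key-derivation step (PBKDF2 with 200,000 iterations, producing an AES-256 key) can be sketched with Python's standard library. Only the algorithm and iteration count come from the feature list above; the function name, salt handling, and hash choice (SHA-256) are illustrative assumptions, and the AES-256-GCM encryption layer itself is omitted.

```python
import hashlib
import os

def derive_vault_key(master_secret: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 32-byte (AES-256-sized) key from the master secret via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", master_secret.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)                       # stored alongside the ciphertext, not secret
key = derive_vault_key("my-master-secret", salt)
assert len(key) == 32                       # suitable as an AES-256-GCM key
```

The salt must be persisted next to each ciphertext so the same key can be re-derived on decryption; the high iteration count is what makes brute-forcing the master secret expensive.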
Quick Start
pip install salmalm
salmalm start
# → http://localhost:18800
A Setup Wizard opens on first launch. Paste an API key, pick a model — done.
Recommended: use pipx install salmalm to avoid dependency conflicts.
Configuration
All configuration is via environment variables. No config files required.
| Variable | Default | Description |
|---|---|---|
| SALMALM_PORT | 18800 | HTTP listen port |
| SALMALM_BIND | 127.0.0.1 | Bind address (0.0.0.0 for LAN access) |
| SALMALM_SECRET | (none) | Master secret for Vault + JWT signing (set this!) |
| SALMALM_ALLOW_SHELL | 0 | Enable shell operators in tool exec (1 to opt in) |
| SALMALM_PYTHON_EVAL | 0 | Enable Python eval tool (1 to opt in) |
| SALMALM_DAILY_BUDGET | (none) | Daily spend cap in USD, e.g. 2.00 |
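A client script pointed at the gateway can mirror the documented defaults with ordinary environment lookups. A minimal sketch; the helper name is hypothetical, not part of SalmAlm's API, and only the variable names and defaults come from the table above.

```python
import os

def salmalm_url() -> str:
    """Build the gateway base URL from the documented environment variables."""
    bind = os.environ.get("SALMALM_BIND", "127.0.0.1")
    port = int(os.environ.get("SALMALM_PORT", "18800"))
    return f"http://{bind}:{port}"

print(salmalm_url())  # with neither variable set → http://127.0.0.1:18800
```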
API Reference
| Method | Path | Description |
|---|---|---|
| POST | /api/chat | Send a message; returns SSE stream |
| GET | /api/sessions | List chat sessions |
| DELETE | /api/sessions/{id} | Delete a session |
| GET | /api/tools | List available tools and their schemas |
| GET | /api/models | List discovered models across all providers |
| GET | /api/costs | Cost summary (today / 30-day) |
| GET | /metrics | Prometheus metrics endpoint |
| GET | /api/vault | List vault entries (values redacted) |
| POST | /api/vault | Store an encrypted secret |
| GET | /api/audit | Recent audit log entries |
Architecture
Client (Browser / Telegram / Discord)
│
▼ HTTP + WebSocket
┌───────────────────────────────────────────┐
│ SalmAlm │
│ │
│ ┌─────────────┐ ┌───────────────────┐ │
│ │ 3-Tier │ │ Engine Pipeline │ │
│ │ LLM Router │──▶│ classify → route │ │
│ │ + Failover │ │ → context → exec │ │
│ └─────────────┘ └───────────────────┘ │
│ │ │
│ ┌──────▼──────────────────────────────┐ │
│ │ Providers │ │
│ │ OpenAI · Anthropic · Google · xAI │ │
│ │ Ollama · LM Studio · vLLM │ │
│ └─────────────────────────────────────┘ │
│ │
│ RAG (BM25 + Semantic + RRF) │
│ Vault (AES-256-GCM) │
│ JWT Auth · OAuth2 │
│ 62 Tools · Cron · Sub-Agents │
│ SQLite Audit (WAL) · Prometheus /metrics │
└───────────────────────────────────────────┘
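The RAG block above fuses BM25 and semantic rankings with Reciprocal Rank Fusion. A minimal sketch of standard RRF, where each document scores the sum of 1/(k + rank) across the ranked lists it appears in; the k=60 constant is the common default from the RRF literature, not a value confirmed for SalmAlm.

```python
from collections import defaultdict

def rrf_fuse(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse ranked lists: score(doc) = sum over lists of 1 / (k + rank(doc))."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25 = ["doc_a", "doc_b", "doc_c"]       # keyword ranking
semantic = ["doc_a", "doc_c", "doc_d"]   # embedding ranking
print(rrf_fuse([bm25, semantic]))        # → ['doc_a', 'doc_c', 'doc_b', 'doc_d']
```

Documents ranked well by both retrievers (doc_a, doc_c here) rise above documents that only one retriever liked, without needing the two score scales to be comparable.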
Development
git clone https://github.com/hyunjun6928-netizen/salmalm.git
cd salmalm
pip install -e ".[dev]"
pytest tests/ -q --timeout=30 -x \
--ignore=tests/test_multi_tenant.py \
--ignore=tests/test_fresh_install_e2e.py
See CONTRIBUTING.md for guidelines.
License
MIT © 2024 hyunjun6928-netizen
SalmAlm = 삶 (Life) + 앎 (Knowledge)
Your life, understood by AI.