Personal AI Gateway – pure-Python, zero-dependency AI assistant with multi-model routing, a web UI, and 67 built-in tools
## What is SalmAlm?
SalmAlm is a personal AI gateway: one Python package that gives you a full-featured AI assistant with a web UI, Telegram/Discord bots, 67 tools, and 10 features you won't find anywhere else.
No Docker. No Node.js. No config files. Just:
    pip install salmalm
    salmalm
    # → http://localhost:18800
First launch opens a Setup Wizard: paste an API key, pick a model, done.
⚠️ Don't run `salmalm` from inside a cloned repo directory: Python will import the local source instead of the installed package. Run it from `~` or any other directory.
## Why SalmAlm?
| Feature | SalmAlm | ChatGPT | OpenClaw | Open WebUI |
|---|---|---|---|---|
| 🔧 Install complexity | `pip install` | N/A | npm + config | Docker |
| 🤖 Multi-provider routing | ✅ | ❌ | ❌ | ✅ |
| 🧠 Self-Evolving Prompt | ✅ | ❌ | ❌ | ❌ |
| 👻 Shadow Mode | ✅ | ❌ | ❌ | ❌ |
| 💀 Dead Man's Switch | ✅ | ❌ | ❌ | ❌ |
| 🔐 Encrypted Vault | ✅ | ❌ | ❌ | ❌ |
| 📱 Telegram + Discord | ✅ | ❌ | ❌ | ❌ |
| 🧩 MCP Marketplace | ✅ | ❌ | ❌ | ❌ |
| 🦙 Local LLM (Ollama/LM Studio/vLLM) | ✅ | ❌ | ❌ | ✅ |
| 📦 Zero dependencies* | ✅ | N/A | ❌ | ❌ |
\*stdlib-only core; optional `cryptography` for the AES-256-GCM vault, otherwise a pure-Python HMAC-CTR fallback
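To make the vault footnote concrete, the PBKDF2-200K key-derivation step can be sketched with nothing but the stdlib. This is an illustrative reconstruction, not SalmAlm's actual code; the function name and salt handling here are assumptions.

```python
import hashlib

def derive_vault_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """PBKDF2-HMAC-SHA256 at 200K iterations -> 32-byte key (sketch)."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = b"\x00" * 16  # in practice: a random salt stored alongside the vault file
key = derive_vault_key("correct horse battery staple", salt)
print(len(key))  # 32 -- suitable for AES-256-GCM or an HMAC-CTR keystream
```

The derived key is what feeds either the `cryptography`-backed AES-256-GCM path or the pure-Python fallback.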
## ⚡ Quick Start
    # One-liner install
    pip install salmalm

    # Start (web UI at http://localhost:18800)
    salmalm

    # Auto-open browser
    salmalm --open

    # Desktop shortcut (double-click to launch!)
    salmalm --shortcut

    # Self-update
    salmalm --update

    # Custom port / external access
    SALMALM_PORT=8080 salmalm
    SALMALM_BIND=0.0.0.0 salmalm   # expose to LAN (see Security section)
## Supported Providers
| Provider | Models | Setup |
|---|---|---|
| Anthropic | Claude Opus 4, Sonnet 4, Haiku 4.5 | Web UI → Settings → API Keys |
| OpenAI | GPT-5.2, GPT-4.1, o3, o4-mini | Web UI → Settings → API Keys |
| Google | Gemini 3 Pro/Flash, 2.5 Pro/Flash | Web UI → Settings → API Keys |
| xAI | Grok-4, Grok-3 | Web UI → Settings → API Keys |
| Local LLM | Ollama / LM Studio / vLLM | Web UI → Settings → Local LLM |

Local LLM endpoints: Ollama `localhost:11434/v1` · LM Studio `localhost:1234/v1` · vLLM `localhost:8000/v1`
## 🎯 Feature Overview
### Core AI
- Smart model routing – auto-selects by complexity (simple → Haiku, moderate → Sonnet, complex → Opus)
- Extended Thinking – deep reasoning with budget control
- 5-stage context compaction – strip binary → trim tools → drop old → truncate → LLM summarize
- Prompt caching – Anthropic `cache_control` for up to 90% cost reduction
- Model failover – exponential backoff + retry across providers
- Sub-agent system – spawn/steer/collect background AI workers
- Infinite loop detection – 3+ identical (tool, args_hash) calls in the last 6 iterations triggers an auto-break
- Irreversible action gate – email send and calendar delete require explicit confirmation
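The loop-detection rule above can be illustrated with a few lines of Python. Names and structure here are hypothetical, not the actual implementation:

```python
import hashlib
import json

def args_hash(args: dict) -> str:
    """Canonical hash of a tool's arguments (illustrative)."""
    return hashlib.sha256(json.dumps(args, sort_keys=True).encode()).hexdigest()

def should_break(history: list[tuple[str, str]], window: int = 6, threshold: int = 3) -> bool:
    """True when any (tool, args_hash) pair repeats `threshold`+ times
    within the last `window` tool iterations."""
    recent = history[-window:]
    return any(recent.count(pair) >= threshold for pair in set(recent))

calls = [("web_search", args_hash({"q": "weather"}))] * 3
print(should_break(calls))  # True: the same call three times in a row
```

Two identical calls are still allowed; only the third within the window trips the breaker.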
### 67 Built-in Tools
Web search (Brave), email (Gmail), calendar (Google), file I/O, shell exec, Python eval, image generation (DALL-E/Aurora), TTS/STT, browser automation (Playwright), RAG search, QR codes, system monitor, OS-native sandbox, mesh networking, canvas preview, and more.
### Web UI
- Real-time streaming (WebSocket + SSE fallback)
- Session branching, rollback, search (`Ctrl+K`), command palette (`Ctrl+Shift+P`)
- Dark/Light themes, EN/KR i18n (language toggle in settings)
- Image paste/drag-drop with vision, code syntax highlighting
- PWA installable, CSP-compatible (all JS in external `app.js`)
### Channels
- Web – full SPA at `localhost:18800`
- Telegram – polling + webhook with inline buttons
- Discord – bot with thread support and mentions
### Admin Panels
Dashboard · Sessions · Cron Jobs · Memory · Debug · Logs · Docs
## ✨ 10 Unique Features
| # | Feature | What it does |
|---|---|---|
| 1 | Self-Evolving Prompt | AI auto-generates personality rules from your conversations |
| 2 | Dead Man's Switch | Emergency actions if you go inactive for N days |
| 3 | Shadow Mode | AI learns your style, replies as you when away |
| 4 | Life Dashboard | Unified health, finance, habits, calendar view |
| 5 | Mood-Aware Response | Detects emotional state, adjusts tone |
| 6 | Encrypted Vault | PBKDF2-200K + AES-256-GCM / HMAC-CTR for API keys |
| 7 | Agent-to-Agent Protocol | HMAC-SHA256 signed communication between instances |
| 8 | A/B Split Response | Two model perspectives on the same question |
| 9 | Time Capsule | Schedule messages to your future self |
| 10 | Thought Stream | Private journaling with hashtag search and mood tracking |
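Feature #7's signed envelope can be sketched with the stdlib `hmac` module. Everything below (field names, the shared-secret handling, the 5-minute freshness window) is an assumption for illustration, not SalmAlm's wire format:

```python
import hashlib
import hmac
import json
import time

SECRET = b"shared-secret"  # placeholder; real instances would exchange their own secret

def sign(payload: dict) -> dict:
    """Attach a timestamp and an HMAC-SHA256 signature over the sorted JSON body."""
    body = dict(payload, ts=int(time.time()))
    msg = json.dumps(body, sort_keys=True).encode()
    body["sig"] = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return body

def verify(body: dict, max_age: int = 300) -> bool:
    """Recompute the signature without `sig` and check freshness."""
    sig = body.pop("sig", "")
    msg = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and time.time() - body["ts"] < max_age

envelope = sign({"cmd": "ping"})
print(verify(envelope))  # True
```

`hmac.compare_digest` avoids timing side channels; the timestamp bounds replay, which the real protocol reportedly tightens further with a nonce.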
## 💰 Cost Optimization
SalmAlm is designed to minimize API costs without sacrificing quality:
| Feature | Effect |
|---|---|
| Dynamic tool loading | 67 tools → 0 (chat) or 7-12 (actions) per request |
| Smart model routing | Simple → Haiku ($1), Moderate → Sonnet ($3), Complex → Opus ($15) |
| Tool schema compression | 7,749 → 693 tokens (91% reduction) |
| System prompt compression | 762 → 310 tokens |
| Intent-based max_tokens | Chat 512, search 1024, code 4096 |
| Intent-based history trim | Chat 10 turns, code 20 turns |
| Cache TTL | Same question cached (30 min–24 h, configurable) |

Result: $7.09/day → $1.23/day (83% savings at 100 calls/day)
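A quick sanity check of the quoted figure:

```python
before, after = 7.09, 1.23   # $/day before and after optimization, from the line above
savings = (before - after) / before
print(f"{savings:.0%}")  # 83%
```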
## 🔒 Security
SalmAlm follows a "dangerous features default OFF" policy:
| Feature | Default | Opt-in |
|---|---|---|
| Network bind | `127.0.0.1` (loopback only) | `SALMALM_BIND=0.0.0.0` |
| Shell operators | Blocked | `SALMALM_ALLOW_SHELL=1` |
| Home dir file read | Workspace only | `SALMALM_ALLOW_HOME_READ=1` |
| Vault fallback | Disabled | `SALMALM_VAULT_FALLBACK=1` |
| Plugin system | Disabled | `SALMALM_PLUGINS=1` |
| CLI OAuth reuse | Disabled | `SALMALM_CLI_OAUTH=1` |
| Elevated exec on external bind | Blocked | `SALMALM_ALLOW_ELEVATED=1` |
| Strict CSP (nonce mode) | Disabled | `SALMALM_CSP_STRICT=1` |
### Tool Risk Tiers
Tools are classified by risk; critical tools are blocked on an external bind without authentication:
| Tier | Tools | External (0.0.0.0) |
|---|---|---|
| 🔴 Critical | `exec`, `exec_session`, `write_file`, `edit_file`, `python_eval`, `sandbox_exec`, `browser`, `email_send`, `gmail`, `google_calendar`, `calendar_delete`, `calendar_add`, `node_manage`, `plugin_manage` | Auth required |
| 🟡 High | `http_request`, `read_file`, `memory_write`, `mesh`, `sub_agent`, `cron_manage`, `screenshot`, `tts`, `stt` | Allowed with warning |
| 🟢 Normal | `web_search`, `weather`, `translate`, etc. | Allowed |
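The tier policy reads naturally as a small decision function. This sketch is hypothetical, uses only a subset of the tools for brevity, and collapses the real middleware to its essentials:

```python
# Illustrative subsets of the tier table above
CRITICAL = {"exec", "write_file", "python_eval", "email_send"}
HIGH = {"http_request", "read_file", "sub_agent"}

def gate(tool: str, bind: str, authed: bool) -> str:
    """Sketch of the external-bind policy: critical tools need auth on 0.0.0.0."""
    if bind == "127.0.0.1":          # loopback: everything allowed
        return "allow"
    if tool in CRITICAL:             # external bind: auth required
        return "allow" if authed else "deny"
    if tool in HIGH:                 # external bind: allowed, but warn
        return "allow-with-warning"
    return "allow"                   # normal tier

print(gate("exec", "0.0.0.0", authed=False))        # deny
print(gate("web_search", "0.0.0.0", authed=False))  # allow
```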
### Security Hardening
- SSRF defense – DNS pinning + private-IP blocking on every redirect hop (web tools and browser)
- Browser SSRF – internal/private URLs blocked on external bind
- Irreversible action gate – `gmail send` and `calendar delete/create` require `_confirmed=true`
- Audit log redaction – secrets scrubbed from tool args before logging (9 pattern types)
- Memory scrubbing – API keys/tokens auto-redacted before storage
- Path validation – `Path.is_relative_to()` for all file operations (no `startswith` bypass)
- Write-path gate – write tools blocked outside allowed roots, even for non-existent paths
- Session isolation – `user_id` column in the session store; export scoped to the user's own data
- Vault export – requires admin role
- Secret isolation – API keys stripped from subprocess environments
- CSRF defense – Origin validation + `X-Requested-With` custom header
- Centralized auth gate – all `/api/` routes require auth unless listed in `_PUBLIC_PATHS`
- Node dispatch – HMAC-SHA256-signed payloads with timestamp + nonce
- 150+ security regression tests in CI
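The `Path.is_relative_to()` point is worth seeing concretely. A minimal sketch (the workspace path is illustrative):

```python
from pathlib import Path

WORKSPACE = Path("/home/user/SalmAlm/workspace")

def is_allowed(candidate: str) -> bool:
    """Resolve first, then check containment. `Path.is_relative_to()` avoids the
    classic `startswith` bypass (e.g. /home/user/SalmAlm/workspace_evil)."""
    return Path(candidate).resolve().is_relative_to(WORKSPACE)

print(is_allowed("/home/user/SalmAlm/workspace/notes.txt"))            # True
print(is_allowed("/home/user/SalmAlm/workspace_evil/x.txt"))           # False
print(is_allowed("/home/user/SalmAlm/workspace/../../../etc/passwd"))  # False
```

Resolving before the check also defeats `..` traversal, which a raw string prefix test would miss.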
See SECURITY.md for the full threat model and details.
## 🦙 Local LLM Setup
SalmAlm works with any OpenAI-compatible local LLM server:
| Server | Default Endpoint | Setup |
|---|---|---|
| Ollama | `http://localhost:11434/v1` | `ollama serve`, then pick a model in the UI |
| LM Studio | `http://localhost:1234/v1` | Start the server in LM Studio |
| vLLM | `http://localhost:8000/v1` | `vllm serve <model>` |

Settings → Local LLM → paste the endpoint URL → Save. An API key is optional (only if your server requires auth).

SalmAlm auto-discovers available models via the `/models`, `/v1/models`, or `/api/tags` endpoints.
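The discovery order can be sketched as a simple probe loop. This is an assumption about the mechanics (SalmAlm's actual client may differ); the response-shape handling covers the two common formats:

```python
import json
import urllib.request

# Listing endpoints named above, probed in order
DISCOVERY_PATHS = ["/models", "/v1/models", "/api/tags"]

def discover_models(base_url: str, timeout: float = 2.0) -> list[str]:
    """Return model names from the first endpoint that answers, else []."""
    for path in DISCOVERY_PATHS:
        try:
            with urllib.request.urlopen(base_url.rstrip("/") + path, timeout=timeout) as r:
                data = json.loads(r.read())
        except OSError:  # connection refused, timeout, HTTP error
            continue
        # OpenAI-style: {"data": [{"id": ...}]}; Ollama /api/tags: {"models": [{"name": ...}]}
        items = data.get("data") or data.get("models") or []
        return [m.get("id") or m.get("name") for m in items]
    return []
```

With Ollama running locally, `discover_models("http://localhost:11434")` would hit `/api/tags` after the first two probes fail.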
## 🔑 Google OAuth Setup (Gmail & Calendar)
1. Google Cloud Console → create an OAuth client
2. Enable the Gmail API + Google Calendar API
3. Set the redirect URI: `http://localhost:18800/api/google/callback`
4. Save the Client ID + Secret in Settings → API Keys
5. Run `/oauth` in chat → click the Google sign-in link
## 🔧 Configuration
    # Server
    SALMALM_PORT=18800           # Web server port
    SALMALM_BIND=127.0.0.1       # Bind address
    SALMALM_HOME=~/SalmAlm       # Data directory

    # AI
    SALMALM_PLANNING=1           # Planning phase (opt-in)
    SALMALM_REFLECT=1            # Reflection pass (opt-in)
    SALMALM_MAX_TOOL_ITER=25     # Max tool iterations (999 = unlimited)
    SALMALM_COST_CAP=0           # Daily cost cap (0 = unlimited)

    # Security
    SALMALM_PLUGINS=1            # Enable plugin system
    SALMALM_CLI_OAUTH=1          # Allow CLI token reuse
    SALMALM_ALLOW_SHELL=1        # Enable shell operators in exec
    SALMALM_ALLOW_HOME_READ=1    # File read outside workspace
    SALMALM_VAULT_FALLBACK=1     # HMAC-CTR vault without cryptography
All settings are also available in the web UI → Settings panels.
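Environment variables like these are typically read with `os.environ` plus a fallback. A hypothetical helper (not SalmAlm's actual code) showing the pattern:

```python
import os

def env_int(name: str, default: int) -> int:
    """Read an integer setting from the environment, falling back on bad/missing values."""
    try:
        return int(os.environ.get(name, default))
    except ValueError:
        return default

os.environ["SALMALM_PORT"] = "8080"   # simulate `SALMALM_PORT=8080 salmalm`
print(env_int("SALMALM_PORT", 18800))  # 8080
print(env_int("SALMALM_COST_CAP", 0))  # 0 (unset, so the default applies)
```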
## 🏗️ Architecture

    Browser ──WebSocket──► SalmAlm ──► Anthropic / OpenAI / Google / xAI / Local LLM
       │                      │
       └──HTTP/SSE──►         ├── SQLite (sessions, usage, memory, audit)
                              ├── Smart Model Routing (complexity-based)
    Telegram ──►              ├── Tool Registry (67 tools, risk-tiered)
    Discord ──►               ├── Security Middleware (auth/CSRF/audit/rate-limit)
                              ├── Sub-Agent Manager
    Mesh Peers ──►            ├── Message Queue (offline + retry)
                              ├── Shared Secret Redaction (security/redact.py)
                              ├── OS-native Sandbox (bwrap/rlimit)
                              ├── Node Gateway (HMAC-signed dispatch)
                              ├── Plugin System (opt-in)
                              └── Vault (PBKDF2 + AES-256-GCM / HMAC-CTR)

- 234 modules, 49K+ lines, 82 test files, 1,806 tests
- Pure Python 3.10+ stdlib – no frameworks, no heavy dependencies
- Data stored under `~/SalmAlm` (configurable via `SALMALM_HOME`)
## 🔌 Plugins

⚠️ Plugins run arbitrary code. Enable with `SALMALM_PLUGINS=1`.

Drop a `.py` file in `~/SalmAlm/plugins/`:

    # plugins/my_plugin.py
    TOOLS = [{
        'name': 'my_tool',
        'description': 'Says hello',
        'input_schema': {'type': 'object', 'properties': {'name': {'type': 'string'}}}
    }]

    def handle_my_tool(args):
        return f"Hello, {args.get('name', 'world')}!"
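How the loader wires `TOOLS` entries to handlers isn't documented here; one plausible convention, mirroring the `handle_<name>` naming in the example (an assumption, not SalmAlm's actual loader), is:

```python
# Hypothetical dispatch sketch: the loader is assumed to map each TOOLS entry
# to a module-level handle_<name> function, as in the plugin example.
TOOLS = [{
    'name': 'my_tool',
    'description': 'Says hello',
    'input_schema': {'type': 'object', 'properties': {'name': {'type': 'string'}}},
}]

def handle_my_tool(args):
    return f"Hello, {args.get('name', 'world')}!"

def dispatch(tool_name: str, args: dict, namespace=globals()):
    """Look up handle_<tool_name> in the plugin's namespace and call it."""
    handler = namespace.get(f"handle_{tool_name}")
    if handler is None:
        raise KeyError(f"no handler for {tool_name}")
    return handler(args)

print(dispatch('my_tool', {'name': 'SalmAlm'}))  # Hello, SalmAlm!
```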
## 🤝 Contributing
See CONTRIBUTING.md.
    git clone https://github.com/hyunjun6928-netizen/salmalm.git
    cd salmalm
    pip install -e ".[dev]"
    for f in tests/test_*.py; do python -m pytest "$f" -q --timeout=30; done
## 📄 License

SalmAlm = 삶 (Life) + 앎 (Knowledge)

Your life, understood by AI.
File details
Details for the file salmalm-0.19.1.tar.gz.
File metadata
- Download URL: salmalm-0.19.1.tar.gz
- Upload date:
- Size: 794.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `204c6479635812f73a7d6916350cc6d1e40fdeec36aced0474e1913febd43bd8` |
| MD5 | `9ef06d98bdf98d61e62dc54c5a5b2804` |
| BLAKE2b-256 | `a65b0546bb4d9d49724bb68f3bbee3f1285422800a6db8204156d983bc8fe199` |
File details
Details for the file salmalm-0.19.1-py3-none-any.whl.
File metadata
- Download URL: salmalm-0.19.1-py3-none-any.whl
- Upload date:
- Size: 796.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `300fe885ce1a4a830c6174857d6b102df8419118b7343f94bb10ab1b19e9313b` |
| MD5 | `77cf1f26cd482613c08d8d452f39c7ed` |
| BLAKE2b-256 | `b8d89b57fb8c96828bef394ceb38c8085500bc38c49f93ba5b9d4ec7fb49caa2` |