
Personal AI Gateway — pure-Python, zero-dependency AI assistant with multi-model routing, a web UI, and 67 built-in tools

Project description

😈 SalmAlm (삶앎)

Your Entire AI Life in One pip install


한국어 README


What is SalmAlm?

SalmAlm is a personal AI gateway — one Python package that gives you a full-featured AI assistant with a web UI, Telegram/Discord bots, 67 tools, and 10 features you won't find anywhere else.

No Docker. No Node.js. No config files. Just:

pip install salmalm
salmalm
# → http://localhost:18800

First launch opens a Setup Wizard — paste an API key, pick a model, done.

โš ๏ธ Don't run salmalm from inside a cloned repo directory โ€” Python will import the local source instead of the installed package. Run from ~ or any other directory.


Why SalmAlm?

| Feature | SalmAlm | ChatGPT | OpenClaw | Open WebUI |
|---|---|---|---|---|
| 🔧 Install complexity | pip install | N/A | npm + config | Docker |
| 🤖 Multi-provider routing | ✅ | ❌ | ✅ | ✅ |
| 🧠 Self-Evolving Prompt | ✅ | ❌ | ❌ | ❌ |
| 👻 Shadow Mode | ✅ | ❌ | ❌ | ❌ |
| 💀 Dead Man's Switch | ✅ | ❌ | ❌ | ❌ |
| 🔐 Encrypted Vault | ✅ | ❌ | ❌ | ❌ |
| 📱 Telegram + Discord | ✅ | ❌ | ✅ | ❌ |
| 🧩 MCP Marketplace | ✅ | ❌ | ❌ | ✅ |
| 🦙 Local LLM (Ollama/LM Studio/vLLM) | ✅ | ❌ | ✅ | ✅ |
| 📦 Zero dependencies* | ✅ | N/A | ❌ | ❌ |

*stdlib-only core; optional cryptography package for the AES-256-GCM vault, with a pure-Python HMAC-CTR fallback otherwise
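The pure-Python fallback can be sketched as a CTR-style keystream built from HMAC-SHA256 over a PBKDF2-derived key. This is a hypothetical illustration of the construction the footnote describes, not SalmAlm's actual vault code; all function names here are invented:

```python
import hashlib
import hmac
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 with 200,000 iterations, per the vault description
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def hmac_ctr_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Keystream block i = HMAC-SHA256(key, nonce || counter); XOR with the data
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hmac.new(key, nonce + i.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

def encrypt(password: str, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(password, salt)
    ct = hmac_ctr_xor(key, nonce, plaintext)
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # integrity tag
    return salt + nonce + tag + ct

def decrypt(password: str, blob: bytes) -> bytes:
    salt, nonce, tag, ct = blob[:16], blob[16:32], blob[32:64], blob[64:]
    key = derive_key(password, salt)
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("bad password or corrupted vault")
    return hmac_ctr_xor(key, nonce, ct)
```

Because CTR mode is malleable, the HMAC tag over nonce and ciphertext is what makes the fallback usable for secrets at rest.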


⚡ Quick Start

# One-liner install
pip install salmalm

# Start (web UI at http://localhost:18800)
salmalm

# Auto-open browser
salmalm --open

# Desktop shortcut (double-click to launch!)
salmalm --shortcut

# Self-update
salmalm --update

# Custom port / external access
SALMALM_PORT=8080 salmalm
SALMALM_BIND=0.0.0.0 salmalm    # expose to LAN (see Security section)

Supported Providers

| Provider | Models | Setup |
|---|---|---|
| Anthropic | Claude Opus 4, Sonnet 4, Haiku 4.5 | Web UI → Settings → API Keys |
| OpenAI | GPT-5.2, GPT-4.1, o3, o4-mini | Web UI → Settings → API Keys |
| Google | Gemini 3 Pro/Flash, 2.5 Pro/Flash | Web UI → Settings → API Keys |
| xAI | Grok-4, Grok-3 | Web UI → Settings → API Keys |
| Local LLM | Ollama / LM Studio / vLLM | Web UI → Settings → Local LLM |

Local LLM endpoints: Ollama localhost:11434/v1 · LM Studio localhost:1234/v1 · vLLM localhost:8000/v1


🎯 Feature Overview

Core AI

  • Smart model routing — auto-selects by complexity (simple→Haiku, moderate→Sonnet, complex→Opus)
  • Extended Thinking — deep reasoning with budget control
  • 5-stage context compaction — strip binary → trim tools → drop old → truncate → LLM summarize
  • Prompt caching — Anthropic cache_control for up to 90% cost reduction on cached input
  • Model failover — exponential backoff + retry across providers
  • Sub-agent system — spawn/steer/collect background AI workers
  • Infinite loop detection — 3+ identical (tool, args_hash) calls in the last 6 iterations triggers an auto-break
  • Irreversible action gate — email send and calendar delete require explicit confirmation
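The loop-detection rule above can be sketched in a few lines. This is a minimal illustration of the stated rule, not SalmAlm's internals; `recent_calls` and the function name are assumptions:

```python
import hashlib
import json

def is_looping(recent_calls: list[tuple[str, dict]]) -> bool:
    """True if any (tool, args_hash) pair appears 3+ times in the last 6 calls."""
    counts: dict[tuple[str, str], int] = {}
    for tool, args in recent_calls[-6:]:
        # Hash the args dict so calls can be compared by value
        args_hash = hashlib.sha256(json.dumps(args, sort_keys=True).encode()).hexdigest()
        key = (tool, args_hash)
        counts[key] = counts.get(key, 0) + 1
    return any(c >= 3 for c in counts.values())
```

A gateway would consult this before each tool iteration and break out of the agent loop when it returns True.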

67 Built-in Tools

Web search (Brave), email (Gmail), calendar (Google), file I/O, shell exec, Python eval, image generation (DALL-E/Aurora), TTS/STT, browser automation (Playwright), RAG search, QR codes, system monitor, OS-native sandbox, mesh networking, canvas preview, and more.

Web UI

  • Real-time streaming (WebSocket + SSE fallback)
  • Session branching, rollback, search (Ctrl+K), command palette (Ctrl+Shift+P)
  • Dark/Light themes, EN/KR i18n (language toggle in settings)
  • Image paste/drag-drop with vision, code syntax highlighting
  • PWA installable, CSP-compatible (all JS in external app.js)

Channels

  • Web — full SPA at localhost:18800
  • Telegram — polling + webhook with inline buttons
  • Discord — bot with thread support and mentions

Admin Panels

📈 Dashboard · 📋 Sessions · ⏰ Cron Jobs · 🧠 Memory · 🔬 Debug · 📋 Logs · 📖 Docs


✨ 10 Unique Features

| # | Feature | What it does |
|---|---|---|
| 1 | Self-Evolving Prompt | AI auto-generates personality rules from your conversations |
| 2 | Dead Man's Switch | Emergency actions if you go inactive for N days |
| 3 | Shadow Mode | AI learns your style and replies as you when you're away |
| 4 | Life Dashboard | Unified health, finance, habits, and calendar view |
| 5 | Mood-Aware Response | Detects your emotional state and adjusts tone |
| 6 | Encrypted Vault | PBKDF2-200K + AES-256-GCM / HMAC-CTR for API keys |
| 7 | Agent-to-Agent Protocol | HMAC-SHA256-signed communication between instances |
| 8 | A/B Split Response | Two model perspectives on the same question |
| 9 | Time Capsule | Schedule messages to your future self |
| 10 | Thought Stream | Private journaling with hashtag search and mood tracking |

💰 Cost Optimization

SalmAlm is designed to minimize API costs without sacrificing quality:

| Feature | Effect |
|---|---|
| Dynamic tool loading | 67 tools → 0 (chat) or 7–12 (actions) per request |
| Smart model routing | Simple→Haiku ($1), Moderate→Sonnet ($3), Complex→Opus ($15) |
| Tool schema compression | 7,749 → 693 tokens (91% reduction) |
| System prompt compression | 762 → 310 tokens |
| Intent-based max_tokens | Chat 512, search 1024, code 4096 |
| Intent-based history trim | Chat 10 turns, code 20 turns |
| Cache TTL | Same question cached (30 min–24 h, configurable) |

Result: $7.09/day → $1.23/day (83% savings at 100 calls/day)
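The intent-based budgets in the table amount to a simple lookup. A sketch with the table's values (the history-trim figure for "search" is not given in the table, so 10 turns is an assumption, as are the names):

```python
# Per-intent budgets, taken from the cost-optimization table above
BUDGETS = {
    "chat":   {"max_tokens": 512,  "history_turns": 10},
    "search": {"max_tokens": 1024, "history_turns": 10},  # turns assumed
    "code":   {"max_tokens": 4096, "history_turns": 20},
}

def budget_for(intent: str) -> dict:
    # Unknown intents fall back to the cheapest (chat) budget
    return BUDGETS.get(intent, BUDGETS["chat"])
```

The router would apply `max_tokens` to the completion request and trim the session history to `history_turns` before sending.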


🔒 Security

SalmAlm follows a "dangerous features are OFF by default" policy:

| Feature | Default | Opt-in |
|---|---|---|
| Network bind | 127.0.0.1 (loopback only) | SALMALM_BIND=0.0.0.0 |
| Shell operators | Blocked | SALMALM_ALLOW_SHELL=1 |
| Home dir file read | Workspace only | SALMALM_ALLOW_HOME_READ=1 |
| Vault fallback | Disabled | SALMALM_VAULT_FALLBACK=1 |
| Plugin system | Disabled | SALMALM_PLUGINS=1 |
| CLI OAuth reuse | Disabled | SALMALM_CLI_OAUTH=1 |
| Elevated exec on external bind | Blocked | SALMALM_ALLOW_ELEVATED=1 |
| Strict CSP (nonce mode) | Disabled | SALMALM_CSP_STRICT=1 |

Tool Risk Tiers

Tools are classified by risk and critical tools are blocked on external bind without authentication:

| Tier | Tools | External (0.0.0.0) |
|---|---|---|
| 🔴 Critical | exec, exec_session, write_file, edit_file, python_eval, sandbox_exec, browser, email_send, gmail, google_calendar, calendar_delete, calendar_add, node_manage, plugin_manage | Auth required |
| 🟡 High | http_request, read_file, memory_write, mesh, sub_agent, cron_manage, screenshot, tts, stt | Allowed with warning |
| 🟢 Normal | web_search, weather, translate, etc. | Allowed |
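The gating rule the table implies can be sketched as follows. The tier sets are abbreviated and the function name is invented; this is an illustration of the policy, not SalmAlm's dispatcher:

```python
CRITICAL = {"exec", "write_file", "python_eval", "email_send"}  # abbreviated
HIGH = {"http_request", "read_file", "sub_agent"}               # abbreviated

def allow_tool(tool: str, bind: str, authenticated: bool) -> tuple[bool, str]:
    external = bind != "127.0.0.1"
    if external and tool in CRITICAL and not authenticated:
        # Critical tools are blocked on external bind without authentication
        return False, "critical tool blocked on external bind without auth"
    if external and tool in HIGH:
        return True, "allowed with warning"
    return True, "allowed"
```

On a loopback bind everything passes; the checks only bite once the server is exposed beyond 127.0.0.1.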

Security Hardening

  • SSRF defense — DNS pinning + private IP block on every redirect hop (web tools AND browser)
  • Browser SSRF — internal/private URLs blocked on external bind
  • Irreversible action gate — gmail send and calendar delete/create require _confirmed=true
  • Audit log redaction — secrets scrubbed from tool args before logging (9 pattern types)
  • Memory scrubbing — API keys/tokens auto-redacted before storage
  • Path validation — Path.is_relative_to() for all file operations (no startswith bypass)
  • Write-path gate — write tools blocked outside allowed roots, even for non-existent paths
  • Session isolation — user_id column in session_store, export scoped to own data
  • Vault export — requires admin role
  • Secret isolation — API keys stripped from subprocess environments
  • CSRF defense — Origin validation + X-Requested-With custom header
  • Centralized auth gate — all /api/ routes require auth unless in _PUBLIC_PATHS
  • Node dispatch — HMAC-SHA256 signed payloads with timestamp + nonce
  • 150+ security regression tests in CI
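The Path.is_relative_to() check avoids the classic prefix bypass: a naive startswith("/home/user") check also accepts "/home/user-evil". A minimal sketch of the idea (function name invented; resolve symlinks and ".." before comparing):

```python
from pathlib import Path

def path_allowed(candidate: str, allowed_roots: list[str]) -> bool:
    # resolve() collapses ".." and symlinks even for non-existent paths
    p = Path(candidate).resolve()
    return any(p.is_relative_to(Path(root).resolve()) for root in allowed_roots)
```

Resolving first is essential: without it, "workspace/../secret" would pass a purely lexical containment check.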

See SECURITY.md for full threat model and details.


🦙 Local LLM Setup

SalmAlm works with any OpenAI-compatible local LLM server:

| Server | Default Endpoint | Setup |
|---|---|---|
| Ollama | http://localhost:11434/v1 | `ollama serve`, then pick a model in the UI |
| LM Studio | http://localhost:1234/v1 | Start the server in LM Studio |
| vLLM | http://localhost:8000/v1 | `vllm serve <model>` |

Settings → Local LLM → paste the endpoint URL → Save. An API key is optional (only needed if your server requires auth).

SalmAlm auto-discovers available models via /models, /v1/models, or /api/tags endpoints.
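Discovery over those endpoints can be done with the stdlib alone, in keeping with the zero-dependency design. A sketch, assuming the server answers one of the listed paths with the usual OpenAI-style or Ollama-style JSON (names invented, error handling abbreviated):

```python
import json
import urllib.error
import urllib.request

def parse_models(data: dict) -> list[str]:
    if "data" in data:       # OpenAI-style /models or /v1/models
        return [m["id"] for m in data["data"]]
    if "models" in data:     # Ollama-style /api/tags
        return [m.get("name") or m.get("id") for m in data["models"]]
    return []

def discover_models(base: str) -> list[str]:
    """Probe the candidate endpoints and return model ids, or []."""
    for path in ("/models", "/v1/models", "/api/tags"):
        try:
            with urllib.request.urlopen(base.rstrip("/") + path, timeout=3) as resp:
                models = parse_models(json.loads(resp.read()))
        except (OSError, ValueError):
            continue  # connection refused, timeout, or non-JSON body
        if models:
            return models
    return []
```

With Ollama running locally, `discover_models("http://localhost:11434")` would return the installed model names; against a dead port it returns an empty list.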


🔑 Google OAuth Setup (Gmail & Calendar)

  1. Google Cloud Console → Create OAuth client
  2. Enable Gmail API + Google Calendar API
  3. Redirect URI: http://localhost:18800/api/google/callback
  4. Save Client ID + Secret in Settings โ†’ API Keys
  5. Run /oauth in chat โ†’ click Google sign-in link

🔧 Configuration

# Server
SALMALM_PORT=18800         # Web server port
SALMALM_BIND=127.0.0.1    # Bind address
SALMALM_HOME=~/SalmAlm    # Data directory

# AI
SALMALM_PLANNING=1         # Planning phase (opt-in)
SALMALM_REFLECT=1          # Reflection pass (opt-in)
SALMALM_MAX_TOOL_ITER=25   # Max tool iterations (999=unlimited)
SALMALM_COST_CAP=0         # Daily cost cap (0=unlimited)

# Security
SALMALM_PLUGINS=1           # Enable plugin system
SALMALM_CLI_OAUTH=1         # Allow CLI token reuse
SALMALM_ALLOW_SHELL=1       # Enable shell operators in exec
SALMALM_ALLOW_HOME_READ=1   # File read outside workspace
SALMALM_VAULT_FALLBACK=1    # HMAC-CTR vault without cryptography

All settings are also available in the web UI → Settings panels.


๐Ÿ—๏ธ Architecture

Browser ──WebSocket──► SalmAlm ──► Anthropic / OpenAI / Google / xAI / Local LLM
   │                     │
   └──HTTP/SSE──►        ├── SQLite (sessions, usage, memory, audit)
                         ├── Smart Model Routing (complexity-based)
Telegram ──►             ├── Tool Registry (67 tools, risk-tiered)
Discord  ──►             ├── Security Middleware (auth/CSRF/audit/rate-limit)
                         ├── Sub-Agent Manager
Mesh Peers ──►           ├── Message Queue (offline + retry)
                         ├── Shared Secret Redaction (security/redact.py)
                         ├── OS-native Sandbox (bwrap/rlimit)
                         ├── Node Gateway (HMAC-signed dispatch)
                         ├── Plugin System (opt-in)
                         └── Vault (PBKDF2 + AES-256-GCM / HMAC-CTR)

  • 234 modules, 49K+ lines, 82 test files, 1,806 tests
  • Pure Python 3.10+ stdlib — no frameworks, no heavy dependencies
  • Data stored under ~/SalmAlm (configurable via SALMALM_HOME)

🔌 Plugins

โš ๏ธ Plugins run arbitrary code. Enable with SALMALM_PLUGINS=1.

Drop a .py file in ~/SalmAlm/plugins/:

# plugins/my_plugin.py
TOOLS = [{
    'name': 'my_tool',
    'description': 'Says hello',
    'input_schema': {'type': 'object', 'properties': {'name': {'type': 'string'}}}
}]

def handle_my_tool(args):
    return f"Hello, {args.get('name', 'world')}!"

๐Ÿค Contributing

See CONTRIBUTING.md.

git clone https://github.com/hyunjun6928-netizen/salmalm.git
cd salmalm
pip install -e ".[dev]"
for f in tests/test_*.py; do python -m pytest "$f" -q --timeout=30; done

📄 License

MIT


SalmAlm = 삶 (Life) + 앎 (Knowledge)

Your life, understood by AI.

Project details



Download files


Source Distribution

salmalm-0.19.1.tar.gz (794.9 kB)

Built Distribution

salmalm-0.19.1-py3-none-any.whl (796.3 kB)

File details

Details for the file salmalm-0.19.1.tar.gz.

File metadata

  • Download URL: salmalm-0.19.1.tar.gz
  • Size: 794.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for salmalm-0.19.1.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 204c6479635812f73a7d6916350cc6d1e40fdeec36aced0474e1913febd43bd8 |
| MD5 | 9ef06d98bdf98d61e62dc54c5a5b2804 |
| BLAKE2b-256 | a65b0546bb4d9d49724bb68f3bbee3f1285422800a6db8204156d983bc8fe199 |

File details

Details for the file salmalm-0.19.1-py3-none-any.whl.

File metadata

  • Download URL: salmalm-0.19.1-py3-none-any.whl
  • Size: 796.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for salmalm-0.19.1-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 300fe885ce1a4a830c6174857d6b102df8419118b7343f94bb10ab1b19e9313b |
| MD5 | 77cf1f26cd482613c08d8d452f39c7ed |
| BLAKE2b-256 | b8d89b57fb8c96828bef394ceb38c8085500bc38c49f93ba5b9d4ec7fb49caa2 |
