# Lattis

A pluggable agent toolkit (server + TUI + web UI). Run AI agents on a server, interact from anywhere.
I built an agent I liked (Binsmith) and wanted to use it from my laptop, my phone, wherever. Lattis is what emerged: a server that hosts agents with a TUI for terminals and a web UI for browsers.
I run it on a Linux box in my Tailscale network. Start a conversation on my laptop, pick it up on my phone, everything stays in sync.
## Quick start

```shell
# Run the TUI (starts a local server automatically)
uvx lattis

# Or run the server explicitly, then connect from anywhere
uvx lattis server
```

Open http://localhost:8000 for the web UI, or run `uvx lattis` from another machine pointing at your server.
## What you get
- Server + clients: FastAPI backend, Textual TUI, bundled web UI
- Persistent conversations: threads stored in SQLite, survive restarts
- Pluggable agents: different agents per thread, swap anytime
- Works anywhere: if it can hit HTTP, it can use your agents
## The setup I use

```
┌─────────────────────────────────────┐
│       Linux server (Tailscale)      │
│            lattis server            │
│                :8000                │
└─────────────────────────────────────┘
     ▲             ▲             ▲
     │             │             │
┌────┴────┐  ┌─────┴─────┐  ┌────┴────┐
│ laptop  │  │   phone   │  │ tablet  │
│   TUI   │  │  web UI   │  │ web UI  │
└─────────┘  └───────────┘  └─────────┘
```
Threads persist on the server. I can start something on my laptop, continue on my phone, come back to it days later.
## Agents

Lattis discovers agents automatically from:

- Built-ins: `assistant`, `poetry` (included with Lattis)
- Entry points: any installed package that registers with `lattis.agents`
- Explicit specs: `module:attr` paths via `--agents` or `AGENT_PLUGINS`
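An explicit spec is just a dotted module path plus an attribute name. A minimal sketch of how a `module:attr` spec can be resolved with the standard library (the helper name `load_spec` is illustrative, not Lattis's actual loader):

```python
from importlib import import_module


def load_spec(spec: str):
    """Resolve a 'module:attr' plugin spec to the object it names.

    Illustrative only -- Lattis's real loader may differ in details.
    """
    module_name, _, attr_path = spec.partition(":")
    if not module_name or not attr_path:
        raise ValueError(f"expected 'module:attr', got {spec!r}")
    obj = import_module(module_name)
    # Support dotted attribute paths like 'pkg.mod:agents.plugin'.
    for part in attr_path.split("."):
        obj = getattr(obj, part)
    return obj


# Resolve a stdlib object the same way a plugin spec would be.
print(load_spec("os.path:join").__name__)  # → join
```

The same mechanism works for anything importable, which is why a bare pydantic-ai `Agent` exported from a module is enough to plug in.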
Binsmith is what started all this. Now it's just another plugin:

```shell
uv pip install binsmith
uvx lattis --agent binsmith
```
## Building your own agent

In the simplest case, just export a pydantic-ai `Agent`:

```python
# my_agent.py
from pydantic_ai import Agent

plugin = Agent("google-gla:gemini-2.0-flash", system_prompt="You are helpful.")
```

```shell
uvx lattis --agents my_agent:plugin
```
For more control (custom dependencies, lifecycle hooks), use the plugin API:

```python
from pydantic_ai import Agent
from lattis.plugins import AgentPlugin, AgentRunContext


def create_agent(model: str) -> Agent:
    return Agent(model, system_prompt="...")


def create_deps(ctx: AgentRunContext):
    # Access ctx.workspace, ctx.project_root, ctx.session_id, etc.
    return MyDeps(...)


plugin = AgentPlugin(
    id="my-agent",
    name="My Agent",
    create_agent=create_agent,
    create_deps=create_deps,
)
```
Register via an entry point in pyproject.toml:

```toml
[project.entry-points."lattis.agents"]
my-agent = "my_package:plugin"
```
## CLI

```shell
lattis          # TUI (default)
lattis tui      # TUI explicitly
lattis server   # API server + web UI
```
### TUI options

```
--server <url>    Connect to a remote server
--local           Skip server discovery, run in-process
--agent <id>      Default agent (local mode)
--agents <specs>  Extra plugins to load (local mode)
```

The TUI auto-discovers a server on localhost:8000 if one is running for the same project. Use `--server` to point at a remote machine.
### Server options

```
--host <host>     Interface to bind (default: 127.0.0.1)
--port <port>     Port (default: 8000)
--reload          Auto-reload on code changes
--agent <id>      Default agent
--agents <specs>  Extra plugins to load
```
### TUI commands

```
/help                  Show help
/threads               List threads
/thread <id>           Switch to thread (creates if needed)
/thread new [id]       Create new thread
/thread delete <id>    Delete thread
/clear                 Clear current thread
/agent                 Show current agent
/agent list [filter]   List available agents
/agent set <id>        Switch agent for this thread
/model                 Show current model
/model list [filter]   List available models
/model set <name>      Switch model
/quit                  Exit
```
## Storage

```
.lattis/
  lattis.db     # SQLite - threads, messages, state
  session_id    # Persistent session identifier
  workspace/    # Shared directory for agents that need it
```
## Configuration

| Variable | Default | Description |
|---|---|---|
| `AGENT_DEFAULT` | `assistant` | Default agent |
| `AGENT_PLUGINS` | | Extra plugins (comma-separated `module:attr`) |
| `LATTIS_SERVER_URL` | | Server URL for remote connections |
| `LATTIS_PROJECT_ROOT` | cwd | Project root for storage |
| `LATTIS_DATA_DIR` | `.lattis` | Data directory |
| `LATTIS_WORKSPACE_DIR` | | Override workspace location |
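Putting the variables together, a server-side environment might look like this (all values here are hypothetical examples, not defaults):

```shell
# Hypothetical values; adjust to your setup.
export AGENT_DEFAULT=assistant
export AGENT_PLUGINS="my_agent:plugin,other_pkg.agents:poet"
export LATTIS_PROJECT_ROOT="$HOME/agents"
export LATTIS_DATA_DIR=.lattis
```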
## Requirements

- Python 3.12+
- uv (recommended)
- An API key for at least one model provider (Gemini, Anthropic, OpenAI)

```shell
export GEMINI_API_KEY=...      # Google
export ANTHROPIC_API_KEY=...   # Anthropic
export OPENAI_API_KEY=...      # OpenAI
```
## Web UI development

The web UI is bundled from `frontend/`. To rebuild:

```shell
cd frontend
npm install
npm run build
```

Static files are served from `lattis/web/static` when present.
## File details

Details for the file `lattis-0.6.1-py3-none-any.whl`:

- Size: 212.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: yes
- Uploaded via: uv/0.9.22 (Ubuntu 24.04)

| Algorithm | Hash digest |
|---|---|
| SHA256 | `75f13e3df1fa982f90b5fd867fefebb4535ffb8ca82b565bae7dec9784f04d9a` |
| MD5 | `3229a6114c0dc93160c9922ce987fd10` |
| BLAKE2b-256 | `db07d2b115b87d2a35172e0feec092aac04dd4defa0347325f66def511041d26` |