# Turnstone

Multi-node AI orchestration platform. Deploy tool-using AI agents across a cluster of servers with direct HTTP routing, interactive interfaces, and enterprise governance.

Named after the Ruddy Turnstone (*Arenaria interpres*), a shorebird that flips stones to discover what's hiding underneath.
## Release Tracks

| Track | Install | Docker | Description |
|---|---|---|---|
| Stable | `pip install turnstone` | `ghcr.io/turnstonelabs/turnstone:stable` | Production-grade. Bugfixes only. |
| Experimental | `pip install turnstone --pre` | `ghcr.io/turnstonelabs/turnstone:experimental` | New features. May have rough edges. |

See docs/releasing.md for the full release process.
## What it does
Turnstone gives LLMs tools — shell, files, search, web, planning — and orchestrates multi-turn conversations where the model investigates, acts, and reports.
- Interactive sessions — terminal CLI or browser UI with parallel workstreams
- Cluster dashboard — real-time view of all nodes and workstreams with console routing proxy
- Intent validation — LLM judge evaluates every tool call with risk assessments and evidence
- Governance — RBAC, OIDC SSO, tool policies, skills, usage tracking, audit logs
- Multi-provider — OpenAI-compatible APIs (vLLM, llama.cpp, NIM), Anthropic Messages API, and Google Gemini
- MCP support — external tool servers with native deferred loading (Anthropic/OpenAI) or BM25 fallback
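The intent-validation gate above can be pictured as a judge that vets each tool call before it runs. The sketch below is illustrative only: the `Verdict` shape, the `guarded_call` wrapper, and the rule-based stub standing in for the LLM judge are all assumptions, not Turnstone's actual API (see docs/judge.md for the real pipeline).

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Verdict:
    allow: bool
    risk: str      # e.g. "low" / "high"
    evidence: str  # the judge's justification

# A judge is any callable that inspects a proposed tool call.
Judge = Callable[[str, dict], Verdict]

def guarded_call(judge: Judge, tool: str, args: dict,
                 execute: Callable[[dict], str]) -> str:
    """Run `execute` only if the judge allows the call; otherwise raise."""
    verdict = judge(tool, args)
    if not verdict.allow:
        raise PermissionError(f"blocked ({verdict.risk}): {verdict.evidence}")
    return execute(args)

# Stub judge standing in for the LLM: flags destructive shell commands.
def stub_judge(tool: str, args: dict) -> Verdict:
    cmd = args.get("command", "")
    if tool == "shell" and "rm -rf" in cmd:
        return Verdict(False, "high", "destructive filesystem command")
    return Verdict(True, "low", "no risky patterns found")
```

In the real system the judge callable would be backed by an LLM returning a risk assessment and evidence; the gate itself stays the same shape.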
## Quickstart

```bash
pip install turnstone

# Terminal REPL
turnstone --base-url http://localhost:8000/v1

# Browser UI
turnstone-server --port 8080 --base-url http://localhost:8000/v1

# Cluster dashboard
pip install "turnstone[console]"
turnstone-console --port 8090
```
## Docker

```bash
cp .env.example .env   # edit LLM_BASE_URL, OPENAI_API_KEY, etc.
docker compose --profile production up
```

See QUICKSTART.md for the bootstrap wizard and docs/docker.md for Docker configuration and profiles.
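A minimal `.env` might look like the fragment below. Only `LLM_BASE_URL` and `OPENAI_API_KEY` are named above; the values shown are placeholders, and `.env.example` is the authoritative list of settings.

```bash
# .env -- values are illustrative placeholders
LLM_BASE_URL=http://localhost:8000/v1
OPENAI_API_KEY=sk-...
```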
## Programmatic (SDK)

```python
from turnstone.sdk import TurnstoneServer

with TurnstoneServer("http://localhost:8080", token="tok_xxx") as client:
    ws = client.create_workstream(name="demo")
    result = client.send_and_wait("Analyze the error logs", ws.ws_id, auto_approve=True)
    print(result.content)
```
## Tools
Built-in tools for shell, files, search, web, memory, notifications, and autonomous sub-agents — plus external tools via MCP with native deferred loading. See docs/tools.md for the full reference and docs/mcp.md for MCP configuration.
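The BM25 fallback for MCP tool selection can be sketched with a self-contained scorer that ranks tool descriptions against a query. The whitespace tokenizer, the `k1`/`b` defaults, and the `bm25_rank` helper here are illustrative assumptions, not Turnstone's actual implementation.

```python
import math
from collections import Counter

def bm25_rank(query: str, docs: dict[str, str],
              k1: float = 1.5, b: float = 0.75) -> list[str]:
    """Return tool names ranked by classic BM25 score against `query`."""
    tokenized = {name: desc.lower().split() for name, desc in docs.items()}
    n = len(tokenized)
    avgdl = sum(len(toks) for toks in tokenized.values()) / n

    # Document frequency per term.
    df: Counter = Counter()
    for toks in tokenized.values():
        df.update(set(toks))

    scores: dict[str, float] = {}
    for name, toks in tokenized.items():
        tf = Counter(toks)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            norm = tf[term] + k1 * (1 - b + b * len(toks) / avgdl)
            score += idf * tf[term] * (k1 + 1) / norm
        scores[name] = score
    return sorted(scores, key=scores.get, reverse=True)
```

With many MCP servers attached, a ranker like this lets the orchestrator surface only the top-k relevant tool definitions to the model instead of the full catalog.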
## Architecture

**Single-node:** Client → Server (direct HTTP + SSE). No external dependencies beyond the database.

**Multi-node:** Client → Console (hash-ring routing proxy) → Server nodes. The console maintains a 65,536-entry bucket cache for O(1) workstream routing. A rebalancer daemon redistributes buckets when nodes join or leave.
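The bucket-cache idea can be illustrated with a small sketch: hash each workstream id onto one of 65,536 buckets, then look the bucket up in an in-memory table. The hashing scheme, the `BucketTable` class, and the round-robin rebalancer are assumptions for illustration; Turnstone's actual hash ring and rebalancer daemon may differ.

```python
import hashlib

NUM_BUCKETS = 65536  # bucket count from the architecture notes above

def bucket_for(workstream_id: str) -> int:
    # Stable hash of the workstream id onto one of 65536 buckets.
    digest = hashlib.sha256(workstream_id.encode()).digest()
    return int.from_bytes(digest[:2], "big")  # 0..65535

class BucketTable:
    """In-memory bucket -> node cache, as the console might keep it."""

    def __init__(self, nodes: list[str]):
        # Initial even assignment of buckets across nodes.
        self.owner = [nodes[b % len(nodes)] for b in range(NUM_BUCKETS)]

    def route(self, workstream_id: str) -> str:
        # O(1): one hash plus one array lookup per request.
        return self.owner[bucket_for(workstream_id)]

    def rebalance(self, nodes: list[str]) -> None:
        # Naive rebalancer: reassign every bucket round-robin over the
        # new node set. A production rebalancer would move far fewer buckets.
        self.owner = [nodes[b % len(nodes)] for b in range(NUM_BUCKETS)]
```

Because the bucket index depends only on the workstream id, every console replica routes a given workstream to the same node without coordination.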
| Component | Purpose |
|---|---|
| `turnstone` | Terminal CLI (REPL) |
| `turnstone-server` | Web UI + REST API + SSE events |
| `turnstone-console` | Cluster dashboard + routing proxy + admin panel |
| `turnstone-channel` | Channel gateway (Discord, with adapters for Slack/Teams planned) |
| `turnstone-admin` | User/token management CLI |
| `turnstone-eval` | Eval harness for prompt/tool optimization |
| `turnstone-bootstrap` | LLM-guided setup wizard |
## Diagrams

UML diagrams live in docs/diagrams/:
| Diagram | Description |
|---|---|
| System Context | Components and external dependencies |
| Package Structure | Python modules and dependency graph |
| Core Engine | SessionUI, ChatSession, LLMProvider |
| Conversation Turn | Message lifecycle through the engine |
| Tool Pipeline | Prepare / approve / execute |
| Workstream States | State machine transitions |
| Console Data Flow | Dashboard data collection |
| Deployment | Docker Compose topology |
| Auth | JWT, scopes, login flows |
| Channels | Discord adapter + routing |
| Judge | Intent validation pipeline |
| OIDC | SSO authorization code flow |
## Documentation
| Topic | Link |
|---|---|
| Configuration reference | docs/settings.md |
| API reference | docs/api-reference.md |
| Docker deployment | docs/docker.md |
| Intent validation (judge) | docs/judge.md |
| Governance & RBAC | docs/governance.md |
| OIDC SSO | docs/oidc.md |
| TLS / mTLS | docs/tls.md |
| Channel integrations | docs/channels.md |
| Console dashboard | docs/console.md |
| Eval harness | docs/eval.md |
| Tools reference | docs/tools.md |
| MCP integration | docs/mcp.md |
## Requirements

- Python 3.11+
- An OpenAI-compatible API endpoint, Anthropic API key, or Google Gemini API key
- Optional: PostgreSQL (`pip install turnstone[postgres]`), Anthropic (`pip install turnstone[anthropic]`)
- Git LFS for cloning (diagram PNGs)
## License
Business Source License 1.1 — free for all use except hosting as a managed service. Converts to Apache 2.0 on 2030-03-01.