
Multi-node AI orchestration platform with tool use, agent routing, and cluster simulation.


Turnstone


Multi-node AI orchestration platform. Deploy tool-using AI agents across a cluster of servers with direct HTTP routing, interactive interfaces, and enterprise governance.

Turnstone console — multi-workstream AI orchestration with mermaid diagrams

Named after the Ruddy Turnstone (Arenaria interpres) — a shorebird that flips stones to discover what's hiding underneath.

Release Tracks

| Track | Install | Docker | Description |
|---|---|---|---|
| Stable | `pip install turnstone` | `ghcr.io/turnstonelabs/turnstone:stable` | Production-grade. Bugfixes only. |
| Experimental | `pip install turnstone --pre` | `ghcr.io/turnstonelabs/turnstone:experimental` | New features. May have rough edges. |

See docs/releasing.md for the full release process.

What it does

Turnstone gives LLMs tools — shell, files, search, web, planning — and orchestrates multi-turn conversations where the model investigates, acts, and reports.

  • Interactive sessions — terminal CLI or browser UI with parallel workstreams
  • Cluster dashboard — real-time view of all nodes and workstreams with console routing proxy
  • Intent validation — LLM judge evaluates every tool call with risk assessments and evidence
  • Governance — RBAC, OIDC SSO, tool policies, skills, usage tracking, audit logs
  • Multi-provider — OpenAI-compatible APIs (vLLM, llama.cpp, NIM), Anthropic Messages API, and Google Gemini
  • MCP support — external tool servers with native deferred loading (Anthropic/OpenAI) or BM25 fallback
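The investigate, act, and report cycle described above can be sketched in a few lines. This is an illustrative sketch only, not Turnstone's engine: `call_llm` and `run_tool` are hypothetical callables standing in for a provider client and a tool executor.

```python
def agent_loop(call_llm, run_tool, user_message, max_turns=8):
    """Minimal tool-use loop: on each turn the model either requests a
    tool or produces a final answer; tool results are appended to the
    conversation and fed back until the model reports."""
    messages = [{"role": "user", "content": user_message}]
    for _ in range(max_turns):
        reply = call_llm(messages)  # {"tool": ..., "args": ...} or {"content": ...}
        if "tool" in reply:
            result = run_tool(reply["tool"], reply["args"])
            messages.append({"role": "assistant", "tool_call": reply})
            messages.append({"role": "tool", "content": result})
        else:
            return reply["content"]
    return "max turns reached"
```

In the real system, the intent-validation judge would sit between the model's tool request and `run_tool`, approving or rejecting each call.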

Turnstone system architecture

Quickstart

```bash
pip install turnstone

# Terminal REPL
turnstone --base-url http://localhost:8000/v1

# Browser UI
turnstone-server --port 8080 --base-url http://localhost:8000/v1

# Cluster dashboard
pip install "turnstone[console]"
turnstone-console --port 8090
```

Docker

```bash
cp .env.example .env  # edit LLM_BASE_URL, OPENAI_API_KEY, etc.
docker compose --profile production up
```

See QUICKSTART.md for the bootstrap wizard and docs/docker.md for Docker configuration and profiles.

Programmatic (SDK)

```python
from turnstone.sdk import TurnstoneServer

with TurnstoneServer("http://localhost:8080", token="tok_xxx") as client:
    ws = client.create_workstream(name="demo")
    result = client.send_and_wait("Analyze the error logs", ws.ws_id, auto_approve=True)
    print(result.content)
```

Tools

Built-in tools for shell, files, search, web, memory, notifications, and autonomous sub-agents — plus external tools via MCP with native deferred loading. See docs/tools.md for the full reference and docs/mcp.md for MCP configuration.
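When a provider has no native deferred-loading support, ranking tool descriptions against the query with BM25 is the standard way to decide which tools to surface. The sketch below is a generic, self-contained BM25 ranker for illustration; it is not Turnstone's actual fallback code, and the tokenization (lowercase whitespace split) is an assumption.

```python
import math
from collections import Counter

def bm25_rank(query, docs, k1=1.5, b=0.75):
    """Return document indices sorted by BM25 relevance to the query.

    docs would be tool descriptions; the top-ranked tools are the ones
    exposed to the model for this query."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(t) for t in tokenized) / len(tokenized)
    n = len(docs)
    df = Counter()  # document frequency per term
    for tokens in tokenized:
        df.update(set(tokens))
    scores = []
    for tokens in tokenized:
        tf = Counter(tokens)
        score = 0.0
        for word in query.lower().split():
            if word not in tf:
                continue
            idf = math.log(1 + (n - df[word] + 0.5) / (df[word] + 0.5))
            norm = tf[word] + k1 * (1 - b + b * len(tokens) / avgdl)
            score += idf * tf[word] * (k1 + 1) / norm
        scores.append(score)
    return sorted(range(n), key=lambda i: -scores[i])
```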

Architecture

Single-node: Client → Server (direct HTTP + SSE). No external dependencies beyond the database.

Multi-node: Client → Console (hash ring routing proxy) → Server nodes. The console maintains a 65536-entry bucket cache for O(1) workstream routing. A rebalancer daemon redistributes buckets when nodes join or leave.
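The bucket-routing scheme can be illustrated with a short sketch. Only the 65536-entry table and the O(1) lookup mirror the description above; the hash function and the round-robin initial assignment are assumptions for illustration, and the real rebalancer daemon would mutate the table incrementally rather than rebuild it.

```python
import hashlib

NUM_BUCKETS = 65536  # size of the console's bucket cache

def bucket_for(ws_id: str) -> int:
    """Map a workstream id to one of 65536 stable buckets."""
    digest = hashlib.sha256(ws_id.encode()).digest()
    return int.from_bytes(digest[:2], "big")  # 0..65535

class BucketRouter:
    """O(1) routing via a precomputed bucket -> node table.

    A rebalancer would rewrite a subset of table entries when nodes
    join or leave, instead of rehashing every workstream."""
    def __init__(self, nodes):
        self.table = [nodes[i % len(nodes)] for i in range(NUM_BUCKETS)]

    def node_for(self, ws_id: str) -> str:
        return self.table[bucket_for(ws_id)]
```

Because the bucket for a given workstream id never changes, routing stays stable across console restarts; only the bucket-to-node mapping moves.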

| Component | Purpose |
|---|---|
| `turnstone` | Terminal CLI (REPL) |
| `turnstone-server` | Web UI + REST API + SSE events |
| `turnstone-console` | Cluster dashboard + routing proxy + admin panel |
| `turnstone-channel` | Channel gateway (Discord, with adapters for Slack/Teams planned) |
| `turnstone-admin` | User/token management CLI |
| `turnstone-eval` | Eval harness for prompt/tool optimization |
| `turnstone-bootstrap` | LLM-guided setup wizard |

Diagrams

UML diagrams in docs/diagrams/:

| Diagram | Description |
|---|---|
| System Context | Components and external dependencies |
| Package Structure | Python modules and dependency graph |
| Core Engine | SessionUI, ChatSession, LLMProvider |
| Conversation Turn | Message lifecycle through the engine |
| Tool Pipeline | Prepare / approve / execute |
| Workstream States | State machine transitions |
| Console Data Flow | Dashboard data collection |
| Deployment | Docker Compose topology |
| Auth | JWT, scopes, login flows |
| Channels | Discord adapter + routing |
| Judge | Intent validation pipeline |
| OIDC | SSO authorization code flow |

Documentation

| Topic | Link |
|---|---|
| Configuration reference | docs/settings.md |
| API reference | docs/api-reference.md |
| Docker deployment | docs/docker.md |
| Intent validation (judge) | docs/judge.md |
| Governance & RBAC | docs/governance.md |
| OIDC SSO | docs/oidc.md |
| TLS / mTLS | docs/tls.md |
| Channel integrations | docs/channels.md |
| Console dashboard | docs/console.md |
| Eval harness | docs/eval.md |
| Tools reference | docs/tools.md |
| MCP integration | docs/mcp.md |

Requirements

  • Python 3.11+
  • An OpenAI-compatible API endpoint, Anthropic API key, or Google Gemini API key
  • Optional: PostgreSQL (`pip install "turnstone[postgres]"`), Anthropic (`pip install "turnstone[anthropic]"`)
  • Git LFS for cloning (diagram PNGs)

License

Business Source License 1.1 — free for all use except hosting as a managed service. Converts to Apache 2.0 on 2030-03-01.

Project details



Download files

Download the file for your platform.

Source Distribution

turnstone-1.3.0a3.tar.gz (3.4 MB)


Built Distribution


turnstone-1.3.0a3-py3-none-any.whl (2.8 MB)


File details

Details for the file turnstone-1.3.0a3.tar.gz.

File metadata

  • Download URL: turnstone-1.3.0a3.tar.gz
  • Upload date:
  • Size: 3.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.13

File hashes

Hashes for turnstone-1.3.0a3.tar.gz:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `474a51d3bf7faf9960d0ca1ee732b2cab4bf0fb32aecad57a51642b3fb7a25b0` |
| MD5 | `ecc749e72c3328075181592c2dbb11a3` |
| BLAKE2b-256 | `9dbaaae22e1e72b9cd57ecb1d9f594a3951bfaa5a963407f6b8ead96594fc025` |

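To verify a download against the published digests, compute the SHA-256 locally and compare. The sketch below is a generic verification helper, not project tooling; `sha256_of` is a hypothetical name, and the commented assertion assumes the tarball sits in the current directory.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks
    so large downloads do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Published digest for the source distribution (from the table above):
EXPECTED = "474a51d3bf7faf9960d0ca1ee732b2cab4bf0fb32aecad57a51642b3fb7a25b0"
# assert sha256_of("turnstone-1.3.0a3.tar.gz") == EXPECTED
```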

Provenance

The following attestation bundles were made for turnstone-1.3.0a3.tar.gz:

Publisher: publish.yml on turnstonelabs/turnstone

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file turnstone-1.3.0a3-py3-none-any.whl.

File metadata

  • Download URL: turnstone-1.3.0a3-py3-none-any.whl
  • Upload date:
  • Size: 2.8 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.13

File hashes

Hashes for turnstone-1.3.0a3-py3-none-any.whl:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4fe0e1df45f224e84ace7c75231487863a13051f86273ba15b26654f19cf3693` |
| MD5 | `8ed65cf2ac0494185c58e8161f99de91` |
| BLAKE2b-256 | `d4c2ee6d8f4c348e6493ead5bfd59b19b6d76e0ed1454fae4064487de1582225` |


Provenance

The following attestation bundles were made for turnstone-1.3.0a3-py3-none-any.whl:

Publisher: publish.yml on turnstonelabs/turnstone

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
