# agentviz

Real-time 3D visualization for multi-agent AI systems. Drop one decorator on your agent functions and watch them appear as robots in a live 3D scene — calls, responses, token streams, errors, and latency all rendered as they happen.

```bash
pip install agentviz
agentviz serve
```
## How it works

- You run `agentviz serve` — this starts a local FastAPI server and opens the 3D UI in your browser
- You decorate your agent functions with `@agentviz.trace`
- Every call, response, error, and token stream is sent to the server and rendered live

The server can be self-hosted anywhere (Linode, Railway, Docker). Multiple agents on different machines can all connect to the same room.
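Conceptually, `@agentviz.trace` wraps a function and emits events around it. The sketch below is a rough, self-contained illustration of that flow; the `events` list stands in for the client that ships events to the server, and none of this is the library's actual code:

```python
import functools
import time

# Stand-in for the client that ships events to the server.
events = []

def trace(func):
    """Sketch of what a tracing decorator does: emit a "call" event,
    run the function, then emit a "response" (or "error") event with
    the measured latency. Not agentviz's actual implementation."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        events.append({"type": "call", "agent": func.__name__})
        start = time.perf_counter()
        try:
            result = func(*args, **kwargs)
        except Exception as exc:
            events.append({"type": "error", "agent": func.__name__, "error": str(exc)})
            raise
        latency_ms = (time.perf_counter() - start) * 1000
        events.append({"type": "response", "agent": func.__name__, "latency_ms": latency_ms})
        return result
    return wrapper

@trace
def fetch_data(query: str) -> str:
    return query.upper()

fetch_data("select 1")  # records a "call" event, then a "response" event
```

The real SDK does the same for `async` functions and attaches server, room, and styling metadata to each event.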
## Quick start

### Simplest — one decorator

```python
import agentviz

agentviz.init(server="http://localhost:8000")

@agentviz.trace
async def fetch_data(query: str) -> str:
    return await db.query(query)

@agentviz.trace(name="Planner", to="orchestrator", color="#9B59B6")
async def plan(goal: str) -> str:
    return await llm.plan(goal)
```

That's it. Run your agents — they show up in the UI automatically.
### Environment variables (zero code changes)

```bash
export AGENTVIZ_SERVER=http://localhost:8000
export AGENTVIZ_PROJECT=my-team
```

Then just use `@agentviz.trace` with no `init()` call.
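For intuition, environment-based configuration of this kind usually resolves explicit arguments first and falls back to the environment. A hypothetical sketch (the variable names come from the snippet above; the fallback defaults here are assumptions, not documented behavior):

```python
import os

def resolve_config(server=None, project=None):
    """Hypothetical config resolution: explicit arguments win,
    then environment variables, then assumed defaults."""
    return {
        "server": server or os.environ.get("AGENTVIZ_SERVER", "http://localhost:8000"),
        "project": project or os.environ.get("AGENTVIZ_PROJECT", "default"),
    }
```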
### WebSocket SDK (full control)

```python
from agentviz import AgentVizClient

async with AgentVizClient(
    server="ws://localhost:8000/agent-ws",
    name="DataFetcher",
    color="#E74C3C",
) as client:
    call_id = await client.emit_call(to="orchestrator", message="Fetching records…")
    result = await do_work()
    await client.emit_response(to="orchestrator", call_id=call_id, result=result)
```
### HTTP client (serverless / AWS Lambda / Cloud Run)

```python
from agentviz import HttpAgentVizClient

client = HttpAgentVizClient(server="https://my-agentviz.railway.app", name="Lambda")
call_id = client.emit_call(to="orchestrator", message="Processing event…")
result = process(event)
client.emit_response(to="orchestrator", call_id=call_id, result=result)
```
### Token streaming

```python
async for chunk in llm.stream(prompt):
    await client.emit_token(chunk.delta)
await client.emit_stream_end()
```

Tokens accumulate in a speech bubble above the robot in real time.
## Features

- `@agentviz.trace` — works on any `async` or sync function, no boilerplate
- Trace trees — nested calls automatically build a parent→child hierarchy (via `contextvars`)
- Token streaming — live speech bubble above each robot as the LLM generates
- Error visualization — red glow + shake animation on agent errors
- Latency labels — floating ms labels between agents, color-coded by speed
- Dynamic agents — robots spawn and despawn as agents connect and disconnect
- 5 layouts — `semicircle`, `pipeline`, `star`, `mesh`, `grid` — switch live from the UI
- Room isolation — `?room=project-name` separates teams on the same server
- Session recording — SQLite-backed, replay any past session from the UI
- No orchestrator required — works for peer-to-peer autonomous agent systems
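The trace-tree feature names `contextvars` as its mechanism. A minimal, self-contained sketch of that pattern (illustrative only, not the library's code):

```python
import contextvars

# The currently executing call, propagated implicitly through the call
# stack (and across await boundaries in async code) by contextvars.
current_call = contextvars.ContextVar("current_call", default=None)
edges = []  # (parent, child) pairs discovered at call time

def traced(name, body):
    """Record this call as a child of whatever call is active, then
    make it the active call while the body runs."""
    parent = current_call.get()
    if parent is not None:
        edges.append((parent, name))
    token = current_call.set(name)
    try:
        return body()
    finally:
        current_call.reset(token)

def worker():
    return "data"

def planner():
    # Nested call: recorded as a child of "planner".
    return traced("worker", worker)

traced("planner", planner)
# edges is now [("planner", "worker")]
```

Because `ContextVar` values are scoped per task, this works even when many agents run concurrently in the same event loop.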
## Integrations

### LangChain

```python
from agentviz import HttpAgentVizClient
from agentviz.integrations.langchain import AgentVizCallbackHandler

client = HttpAgentVizClient(server="http://localhost:8000", name="LangChainAgent")
handler = AgentVizCallbackHandler(client)
chain.invoke({"input": "..."}, config={"callbacks": [handler]})
```
### LangGraph

```python
from agentviz.integrations.langgraph import get_langgraph_callbacks

callbacks = get_langgraph_callbacks(client)
graph.invoke(state, config={"callbacks": callbacks})
```
### OpenAI Agents SDK

```python
import agentviz
from agentviz.integrations.openai_agents import patch_openai_agents

agentviz.init(server="http://localhost:8000")
patch_openai_agents()  # patches globally — all agents auto-traced from here
```
### OpenTelemetry

```python
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

from agentviz.integrations.otel import AgentVizSpanExporter

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(AgentVizSpanExporter()))
```

Every OTel span becomes a call/response/error event in the 3D scene automatically.
### MCP server (Claude Desktop / Cursor)

Add to your MCP host config:

```json
{
  "mcpServers": {
    "agentviz": {
      "command": "python",
      "args": ["-m", "agentviz.mcp_server"],
      "env": {
        "AGENTVIZ_SERVER": "http://localhost:8000"
      }
    }
  }
}
```

Then use `agentviz_emit_call`, `agentviz_emit_response`, `agentviz_emit_token`, etc. as tools from within Claude.
## Self-hosting

### Docker

```bash
docker build -t agentviz .
docker run -p 8000:8000 agentviz
```

### docker-compose

```bash
docker-compose up
```

### Railway / Render / Fly.io

Push the repo and set the start command to:

```bash
agentviz serve --host 0.0.0.0 --port $PORT --no-browser
```
## CLI

```bash
agentviz serve               # start server, open browser
agentviz serve --port 9000   # custom port
agentviz serve --no-browser  # headless (for servers)
agentviz serve --demo        # also start demo agents
agentviz --version
```

Or:

```bash
python -m agentviz serve
```
## Multi-room / multi-team

Each `?room=<name>` URL gets its own isolated scene, agent registry, and session history, so a single deployed server can be shared across multiple teams:

```
https://agentviz.mycompany.com/?room=search-team
https://agentviz.mycompany.com/?room=billing-team
```
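Server-side, room isolation amounts to keying all state by room name. A conceptual sketch, assuming a dict-of-rooms registry (not agentviz's actual data structures):

```python
from collections import defaultdict

# Each room gets its own independent agent registry and event log,
# created lazily on first use. Hypothetical illustration only.
rooms = defaultdict(lambda: {"agents": set(), "events": []})

def register(room, agent):
    rooms[room]["agents"].add(agent)

def emit(room, event):
    rooms[room]["events"].append(event)

register("search-team", "Planner")
register("billing-team", "Invoicer")
emit("search-team", {"type": "call", "from": "Planner"})

# Rooms never see each other's state:
assert rooms["billing-team"]["events"] == []
```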
## License

MIT