LlamaIndex integration for the Dominion Observatory — instrument llama_index.tools.mcp.McpToolSpec tool calls with anonymised runtime telemetry to the cross-ecosystem MCP trust layer.
Project description
dominion-observatory-llamaindex
LlamaIndex integration for the Dominion Observatory — the behavioural trust layer for the AI agent economy. Instrument llama_index.tools.mcp.McpToolSpec so every MCP tool call your LlamaIndex agent makes emits a single six-field runtime telemetry report.
Nothing sensitive ever leaves your process: only agent_id, server_url, success, latency_ms, tool_name, and http_status. No prompts, no tool arguments, no tool outputs, no user IDs, no network metadata. Singapore PDPA compliant, EU AI Act Article 12 compatible.
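For concreteness, one report can be pictured as a flat six-field record. This is an illustration only: the field names come from this page, while the actual wire format is an internal detail of dominion-observatory-sdk.

```python
# Hypothetical illustration of a single telemetry report -- the six
# documented fields and nothing else. The exact wire format is an
# internal detail of dominion-observatory-sdk, not this dict.
report = {
    "agent_id": "acme-scheduler@1.2.0",   # stable app identifier
    "server_url": "https://my-mcp-server.example.com/mcp",
    "success": True,                      # call returned, isError was falsy
    "latency_ms": 142,                    # wall-clock around call_tool
    "tool_name": "query",
    "http_status": 200,                   # 200 on success, 500 on error
}
```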
Why
The Observatory already watches 4,500+ MCP servers via direct probes and user reports. What it cannot see on its own is what real LlamaIndex agents experience against those servers in production: which tools actually work for which agents at what latency, and which servers quietly fail under real traffic. This adapter closes that gap.
Install
```shell
pip install 'dominion-observatory-llamaindex[llamaindex]'
```
The [llamaindex] extra pulls in llama-index-tools-mcp>=0.4.0. The package itself only hard-depends on dominion-observatory-sdk>=0.2.0, so it is safe to install first in layered Dockerfiles.
Usage — drop-in replacement
Swap McpToolSpec for ObservatoryMcpToolSpec:
```python
from llama_index.tools.mcp import BasicMCPClient
from dominion_observatory_llamaindex import ObservatoryMcpToolSpec

client = BasicMCPClient("https://my-mcp-server.example.com/mcp")
spec = ObservatoryMcpToolSpec(
    client=client,
    agent_id="acme-scheduler@1.2.0",
    # server_url is inferred from client.command_or_url when omitted.
)
tools = await spec.to_tool_list_async()
# Hand `tools` to any LlamaIndex agent (ReActAgent, FunctionAgent, Workflow, etc.).
```
Every tool returned by to_tool_list_async() will emit one Observatory report per call_tool invocation, whether the call succeeds, fails at the MCP level (CallToolResult.isError=True), or raises an exception.
Usage — wrap an existing spec
If a user already has an McpToolSpec somewhere downstream, instrument_tool_spec clones its config into an instrumented replacement without touching the original:
```python
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec
from dominion_observatory_llamaindex import instrument_tool_spec

client = BasicMCPClient("https://my-mcp-server.example.com/mcp")
spec = McpToolSpec(client=client, allowed_tools=["query"])
spec = instrument_tool_spec(
    spec,
    agent_id="acme-scheduler@1.2.0",
)
tools = await spec.to_tool_list_async()
```
agent_id conventions
- Required. Must be a non-empty string.
- The strings `anonymous` and `observatory_probe` are reserved for Observatory internals and are rejected at construction time.
- Recommended shape: `<app>-<role>@<semver>`, e.g. `acme-scheduler@1.2.0`. That way the Observatory can distinguish deploys when you bump versions.
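The rules above are easy to check up front. A minimal sketch, assuming nothing about the package internals (`RESERVED` and `validate_agent_id` are illustrative names, not part of the public API):

```python
import re

# Reserved for Observatory internals; rejected at construction time.
RESERVED = {"anonymous", "observatory_probe"}

# Recommended shape: <app>-<role>@<semver>, e.g. acme-scheduler@1.2.0.
RECOMMENDED = re.compile(r"^[a-z0-9-]+@\d+\.\d+\.\d+$")

def validate_agent_id(agent_id):
    """Mirror the documented rules: non-empty string, not a reserved name."""
    if not isinstance(agent_id, str) or not agent_id:
        raise ValueError("agent_id must be a non-empty string")
    if agent_id in RESERVED:
        raise ValueError(f"agent_id {agent_id!r} is reserved for Observatory internals")
    return agent_id
```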
server_url resolution
- If passed explicitly, that value wins.
- Otherwise the wrapper reads `client.command_or_url` (the field `BasicMCPClient` uses), then `client.url`, then `client.server_url`.
- If none of those produce a non-empty string, telemetry is disabled for the spec; tool calls still execute normally. This is the safe default for stdio MCP servers that have no URL identity.
What is reported
Exactly six fields per invocation, nothing else:
| field | type | value |
|---|---|---|
| `agent_id` | `str` | stable app identifier you passed to the constructor |
| `server_url` | `str` | MCP server URL (explicit or derived from the client) |
| `success` | `bool` | `True` if the call returned and `isError` was falsy |
| `latency_ms` | `int` | wall-clock elapsed around `client.call_tool(...)` |
| `tool_name` | `str` | the MCP tool name dispatched |
| `http_status` | `int` | 200 on success, 500 on error (MCP-level or exception) |
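The `success`/`http_status` pairing follows mechanically from the call outcome; a sketch of that mapping (illustrative helper, not the package's code):

```python
def outcome_fields(is_error, raised):
    """Map a call outcome to the (success, http_status) pair documented above:
    200 only when the call returned and CallToolResult.isError was falsy;
    500 for MCP-level errors and raised exceptions alike."""
    success = not raised and not is_error
    return success, 200 if success else 500
```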
Failure policy
- An exception raised by `call_tool` is caught long enough to emit a `success=False, http_status=500` report, then re-raised unchanged. Your agent code sees the original traceback.
- A report-emission failure (network error, SDK import error, SDK bug) is suppressed at DEBUG level. Telemetry never breaks the agent's critical path.
- If the Observatory SDK is absent at runtime, the wrapper still runs and logs a one-line DEBUG message per dropped report.
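Taken together, the policy amounts to: time the call, emit a best-effort report, never swallow the caller's exception, and never raise one of your own. A sketch under those assumptions (all names here are illustrative):

```python
import logging
import time

log = logging.getLogger("observatory_sketch")

def call_with_report(call_tool, tool_name, emit):
    """Sketch of the documented failure policy: exceptions from call_tool
    are re-raised unchanged after emitting a success=False report, and
    emission failures are suppressed at DEBUG level."""
    start = time.perf_counter()
    try:
        result = call_tool(tool_name)
    except Exception:
        _safe_emit(emit, tool_name, start, success=False)
        raise  # agent code sees the original traceback
    success = not getattr(result, "isError", False)
    _safe_emit(emit, tool_name, start, success=success)
    return result

def _safe_emit(emit, tool_name, start, success):
    latency_ms = int((time.perf_counter() - start) * 1000)
    try:
        emit(tool_name=tool_name, success=success,
             latency_ms=latency_ms, http_status=200 if success else 500)
    except Exception as exc:  # telemetry never breaks the critical path
        log.debug("dropped Observatory report: %s", exc)
```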
Privacy posture
- No tool arguments are sent.
- No tool outputs are sent.
- No user IDs, chat history, prompts, or embeddings are sent.
- `agent_id` is opaque to the Observatory; pick any stable identifier that does not personally identify an end-user. Recommended: application build ID.
Part of the Dominion Agent Economy Empire
This adapter is part of a family:
| Package | Framework | PyPI |
|---|---|---|
| `dominion-observatory-sdk` | framework-free | https://pypi.org/project/dominion-observatory-sdk/ |
| `dominion-observatory-langchain` | LangChain | https://pypi.org/project/dominion-observatory-langchain/ |
| `dominion-observatory-crewai` | CrewAI | https://pypi.org/project/dominion-observatory-crewai/ |
| `dominion-observatory-autogen` | Microsoft AutoGen | https://pypi.org/project/dominion-observatory-autogen/ |
| `dominion-observatory-llamaindex` | LlamaIndex | this package |
Licence
MIT. See LICENSE.
Project details
File details
Details for the file dominion_observatory_llamaindex-0.1.0.tar.gz.
File metadata
- Download URL: dominion_observatory_llamaindex-0.1.0.tar.gz
- Upload date:
- Size: 9.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `ca6abd4b9e1a4d462f5f05eb09bcb723ceba24e0821f24c394b21f80b9a16647` |
| MD5 | `f44329e94b5960af1104daf0a580f5c3` |
| BLAKE2b-256 | `0a48816370e8bf42b1d7ee48e46be68d4cccf09c43c73667a25e86de6b81ddbd` |
File details
Details for the file dominion_observatory_llamaindex-0.1.0-py3-none-any.whl.
File metadata
- Download URL: dominion_observatory_llamaindex-0.1.0-py3-none-any.whl
- Upload date:
- Size: 9.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `484462b3f4d09aa0b9f0ac10c5167fa82783c82352f7843bdd11041f9054f16f` |
| MD5 | `640ad59a40e6135a78a107653a07c88c` |
| BLAKE2b-256 | `6409a3af4b30e90984759b76c40f815d1cbbe2adec8ac1677738aec78472a947` |