Observability and reliability platform for agentic AI systems

Anjor

AI agents fail silently. A tool times out, a schema drifts, the context window fills up — and you find out from a user complaint, not a dashboard.

Anjor fixes that. It intercepts your agent's HTTP traffic at the protocol layer and gives you full visibility into every LLM call, tool use, and MCP server interaction — latency, token usage, context window growth, schema drift, prompt changes — without changing a single line of your agent code. Beyond passive logging, it surfaces actionable intelligence: failure pattern clustering, token optimization suggestions, and per-tool quality grades (A–F).

One-line install. No cloud. No account required.


Dashboard

Overview

The overview gives a bird's-eye view across the whole platform — tool health, LLM cost, MCP server status, top failure patterns, and drift alerts at a glance.

More screenshots

  • LLM Usage — token consumption and cost by model with daily trend and cache savings
  • MCP Servers — per-server call volume, success rates, and tool breakdown
  • Intelligence — failure clusters, token optimization opportunities, and quality scores
  • Tools — latency percentiles, drift detection, and per-tool drill-down
  • Traces — multi-agent span trees and cross-agent attribution


Install

pip install anjor

Quickstart

1. Start the collector and dashboard (one command, one port):

anjor start
# Anjor collector  http://localhost:7843/health
# Anjor dashboard  http://localhost:7843/ui/

2. Add one line to your agent:

import anjor
anjor.patch()   # that's it — httpx is now instrumented

import anthropic
client = anthropic.Anthropic()
# make calls as normal — they're captured automatically

Open http://localhost:7843/ui/ to see the dashboard.

3. Query the API directly:

curl http://localhost:7843/health
curl http://localhost:7843/tools
curl http://localhost:7843/mcp
curl http://localhost:7843/intelligence/failures
curl http://localhost:7843/intelligence/quality/tools

What it captures

  • Tool calls — name, status (success/failure), failure type, latency
  • MCP servers — per-server call volume, success rate, latency; parsed from mcp__server__tool naming
  • Schema fingerprints — SHA-256 structural hash of tool input/output shape
  • Schema drift — field-level diff against the baseline for each tool
  • LLM calls — model, latency, finish reason; Anthropic, OpenAI, and Gemini
  • Token usage — input + output + cache_read + cache_write tokens per call
  • Context window — tokens used vs model limit, utilization %, per-trace growth
  • Cache savings — prompt cache hit rate and estimated cost savings
  • Context hogs — per-tool average output size, % of context consumed
  • System prompt drift — SHA-256 per agent; alerts when the prompt changes between calls
  • Failure patterns — clustered failure analysis with descriptions and fix suggestions
  • Token optimization — tools consuming >5% of the context window, with cost savings estimates
  • Quality scores — per-tool reliability/schema-stability/latency grade (A–F)
  • Run quality — per-trace context efficiency, failure recovery, diversity grade
  • Multi-agent spans — parent/child span linking across agent boundaries
  • Trace graphs — DAG reconstruction, topological order, cycle detection
  • Cross-agent attribution — token usage and failure rate broken down per agent
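The schema-fingerprint signal can be sketched roughly as follows: reduce a payload to its structural shape (key names and value types, not values), canonicalize, and hash. This is an illustrative sketch, not Anjor's actual canonicalization, which may differ:

```python
import hashlib
import json

def schema_fingerprint(payload):
    """Hash the *shape* of a payload: keys and value types, never values."""
    def shape(value):
        if isinstance(value, dict):
            return {k: shape(v) for k, v in sorted(value.items())}
        if isinstance(value, list):
            return [shape(value[0])] if value else []
        return type(value).__name__
    canonical = json.dumps(shape(payload), sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

a = schema_fingerprint({"path": "/tmp/x", "recursive": True})
b = schema_fingerprint({"path": "/etc/hosts", "recursive": False})
c = schema_fingerprint({"path": "/tmp/x"})  # a field disappeared

print(a == b)  # True  — different values, same shape
print(a == c)  # False — shape changed, i.e. schema drift
```

Because only the shape is hashed, two calls with different arguments produce the same fingerprint, while a dropped or retyped field changes it — which is what makes a field-level drift diff against the baseline possible.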

Supported providers

Provider SDK Intercepted endpoint
Anthropic anthropic api.anthropic.com/v1/messages
OpenAI openai api.openai.com/v1/chat/completions
Google Gemini google-generativeai generativelanguage.googleapis.com/.../generateContent

All three providers are auto-detected — no configuration required.
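Auto-detection presumably keys off the request hostname. A minimal illustrative sketch — the function name and mapping are assumptions; only the hostnames come from the table above:

```python
from urllib.parse import urlparse

# Hostnames from the supported-providers table; the mapping itself is hypothetical.
PROVIDER_HOSTS = {
    "api.anthropic.com": "anthropic",
    "api.openai.com": "openai",
    "generativelanguage.googleapis.com": "gemini",
}

def detect_provider(url: str):
    """Return the provider for an intercepted request URL, or None."""
    return PROVIDER_HOSTS.get(urlparse(url).hostname)

print(detect_provider("https://api.anthropic.com/v1/messages"))       # anthropic
print(detect_provider("https://api.openai.com/v1/chat/completions"))  # openai
print(detect_provider("https://example.com/v1/other"))                # None
```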


MCP tool analytics

MCP tools are automatically identified by their naming convention — no extra configuration needed. Any tool whose name follows mcp__<server>__<tool> is grouped by server in the MCP dashboard:

mcp__github__create_pull_request   →  server: github,     tool: create_pull_request
mcp__filesystem__read_file         →  server: filesystem, tool: read_file
mcp__brave_search__web_search      →  server: brave_search, tool: web_search

The /mcp endpoint returns per-server and per-tool aggregates and supports a ?days=N filter.
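The grouping rule can be sketched in a few lines — an illustrative parser for the mcp__<server>__<tool> convention, not Anjor's actual implementation:

```python
def parse_mcp_tool(name: str):
    """Split mcp__<server>__<tool> into (server, tool); None for non-MCP tools."""
    if not name.startswith("mcp__"):
        return None
    server, sep, tool = name[len("mcp__"):].partition("__")
    if not sep or not server or not tool:
        return None
    return server, tool

print(parse_mcp_tool("mcp__github__create_pull_request"))  # ('github', 'create_pull_request')
print(parse_mcp_tool("mcp__brave_search__web_search"))     # ('brave_search', 'web_search')
print(parse_mcp_tool("read_file"))                         # None
```

Splitting on the first double underscore after the mcp__ prefix keeps server names with single underscores (brave_search) intact.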


API endpoints

Method Path Description
POST /events Ingest a tool/LLM/span event
GET /tools All tools with summary stats
GET /tools/{name} Tool detail (latency percentiles, drift)
GET /mcp MCP server and tool aggregates (?days=N)
GET /llm LLM call summary by model (?days=N)
GET /llm/usage/daily Daily token usage by model (?days=N)
GET /calls Paginated raw event log
GET /traces Trace list (newest first)
GET /traces/{id}/graph DAG graph for a single trace
GET /health Uptime, queue depth, db path
GET /intelligence/failures Failure clusters sorted by rate
GET /intelligence/optimization Token hog tools + savings estimates
GET /intelligence/quality/tools Per-tool quality scores + grade
GET /intelligence/quality/runs Per-trace run quality scores + grade
GET /intelligence/attribution Per-agent token and failure attribution

Configuration

Via environment variables:

ANJOR_DB_PATH=./my_project.db python my_agent.py
ANJOR_BATCH_SIZE=1 ANJOR_BATCH_INTERVAL_MS=100 python my_agent.py
ANJOR_LOG_LEVEL=DEBUG python my_agent.py

Via .anjor.toml in your project root:

db_path = "my_project.db"
batch_size = 10
batch_interval_ms = 200
log_level = "DEBUG"

Via code:

import anjor
from anjor.core.config import AnjorConfig

anjor.patch(config=AnjorConfig(db_path="my_project.db", batch_size=1))

Limitations

  • requests library not intercepted — all three provider SDKs use httpx by default
  • Streaming responses are not parsed; only non-streaming calls are captured
  • No cloud sync, authentication, or team features

Development

git clone https://github.com/anjor-labs/anjor.git
cd anjor
pip install -e ".[dev]"
pytest --cov=anjor --cov-fail-under=95 -q
ruff check anjor/ tests/
mypy anjor/
anjor start

See CONTRIBUTING.md for full guidelines.



License

MIT © Anjor Labs
