
observe-instrument-mcp

An MCP server that automatically instruments Python AI agents with the ioa-observe-sdk — adding OpenTelemetry-based tracing, metrics, and logs with zero manual effort.

Works with any MCP-compatible AI coding assistant: Claude Desktop, Cursor, Windsurf, and others.

What it does

Two tools:

instrument_agent — reads a Python agent file, applies full observe SDK instrumentation, writes it back, and returns a summary of changes. Creates a .bak backup before modifying.

check_instrumentation — audits a file for missing instrumentation without modifying it.

Supported frameworks: LlamaIndex, LangGraph, CrewAI, raw OpenAI SDK.
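As a rough sketch of what an instrumentation audit involves (illustrative only, not the server's actual implementation, and assuming the SDK's import name is ioa_observe, which may differ), a checker could parse the file and look for the SDK import:

```python
import ast

def has_observe_import(source: str) -> bool:
    """Return True if the source already imports the observe SDK.

    Illustrative only: the real check_instrumentation tool performs a
    richer audit (decorators, initialization calls, framework hooks).
    The module name "ioa_observe" is an assumption for this sketch.
    """
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            if any(alias.name.startswith("ioa_observe") for alias in node.names):
                return True
        elif isinstance(node, ast.ImportFrom):
            if node.module and node.module.startswith("ioa_observe"):
                return True
    return False

# A bare agent file has no instrumentation yet; an instrumented one does.
bare = "import openai\n\nresp = openai.chat()\n"
instrumented = "from ioa_observe.sdk import Observe\n"
print(has_observe_import(bare))          # False
print(has_observe_import(instrumented))  # True
```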

Installation

pip install observe-instrument-mcp
# or
uv add observe-instrument-mcp

The server uses an LLM to apply instrumentation, so it requires an API key for your chosen provider. Defaults to Claude (ANTHROPIC_API_KEY); see supported providers below.

Configuration

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS path):

{
  "mcpServers": {
    "observe-instrument": {
      "command": "uvx",
      "args": ["observe-instrument-mcp"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}

Cursor

Add to .cursor/mcp.json in your project:

{
  "mcpServers": {
    "observe-instrument": {
      "command": "uvx",
      "args": ["observe-instrument-mcp"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}

Windsurf

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "observe-instrument": {
      "command": "uvx",
      "args": ["observe-instrument-mcp"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}

Usage

Once configured, ask your AI assistant:

Instrument my agent with the observe SDK: path/to/my_agent.py
Check what observe SDK instrumentation is missing from path/to/my_agent.py

Environment variables

Variable           Description
LLM_MODEL          Model to use (default: claude-sonnet-4-6). See provider table below.
ANTHROPIC_API_KEY  Required for Anthropic models
OPENAI_API_KEY     Required for OpenAI models
GEMINI_API_KEY     Required for Google Gemini models
GROQ_API_KEY       Required for Groq models

Supported providers

Provider              Key variable         LLM_MODEL example
Anthropic             ANTHROPIC_API_KEY    claude-sonnet-4-6
OpenAI                OPENAI_API_KEY       gpt-4o
Google Gemini         GEMINI_API_KEY       gemini/gemini-2.0-flash
Groq                  GROQ_API_KEY         groq/llama-3.3-70b
Ollama (local, free)  none                 ollama/llama3.2

After instrumentation

Install the SDK in your project:

pip install ioa-observe-sdk
# or
uv add ioa-observe-sdk

Start the observability stack (OTel Collector + ClickHouse):

cd path/to/observe/deploy
docker compose up -d

Run your agent:

OPENAI_API_KEY=sk-... OTLP_HTTP_ENDPOINT=http://localhost:4318 python my_agent.py

Query traces:

docker exec -it clickhouse-server clickhouse-client --user admin --password admin
SELECT SpanName, ServiceName, Duration / 1000000. AS ms, Timestamp
FROM otel_traces
ORDER BY Timestamp DESC
LIMIT 20;
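If you prefer to query from a script rather than the CLI, ClickHouse also serves queries over its HTTP interface (port 8123 by default). A minimal sketch, assuming the compose stack exposes that port and uses the admin/admin credentials shown above:

```python
import urllib.parse
import urllib.request

def build_trace_query_url(host: str = "localhost", port: int = 8123) -> str:
    """Build a ClickHouse HTTP query URL for the most recent spans."""
    query = (
        "SELECT SpanName, ServiceName, Duration / 1000000. AS ms, Timestamp "
        "FROM otel_traces ORDER BY Timestamp DESC LIMIT 20"
    )
    return f"http://{host}:{port}/?{urllib.parse.urlencode({'query': query})}"

url = build_trace_query_url()
print(url)
# To actually run the query (requires the stack from `docker compose up -d`):
# req = urllib.request.Request(url, headers={
#     "X-ClickHouse-User": "admin", "X-ClickHouse-Key": "admin"})
# print(urllib.request.urlopen(req).read().decode())
```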

Development

git clone https://github.com/alanzha2/observe-instrument-mcp
cd observe-instrument-mcp
pip install -e .

# Test the server locally
mcp dev observe_instrument_mcp/server.py

License

Apache-2.0

