The PrysmAI Python SDK unifies the proxy and MCP paths into one AI control plane: capture traces, security findings, policy decisions, and governance evidence for production AI systems.


PrysmAI Python SDK

PrysmAI is the control plane for production AI.

This SDK gives you two integration paths into the same Prysm control plane:

  • Proxy path for application traffic you route through Prysm
  • MCP path for agent runtimes that connect to Prysm as a governance and evidence surface

Both paths should produce the same operational outcomes in Prysm:

  • request traces
  • security findings
  • policy decisions
  • governance sessions
  • reviewable evidence


Your App          -> Prysm Proxy (/api/v1) -> Model Provider
Agent Runtime     -> Prysm MCP   (/api/mcp) -> Same control plane

Installation

pip install prysmai

# Optional integrations
pip install prysmai[langgraph]
pip install prysmai[crewai]
pip install prysmai[agent-framework]
pip install prysmai[all]

Requires Python 3.9+.

The Golden Paths

1. Proxy path

Use this when you are building an AI application directly and want Prysm in the request path.

from prysmai import PrysmClient

prysm = PrysmClient(
    prysm_key="sk-prysm-...",
    base_url="https://prysmai.io/api/v1",
)

client = prysm.llm()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Explain quantum computing simply."}],
)

print(response.choices[0].message.content)

2. Wrap an existing OpenAI client

Use this when you already have an OpenAI client and want to add Prysm without rewriting the rest of your app.

from openai import OpenAI
from prysmai import monitor

client = OpenAI()
monitored = monitor(client, prysm_key="sk-prysm-...")

response = monitored.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize the meeting notes."}],
)

3. MCP path

Use this when your runtime connects to MCP-compatible tools and you want Prysm to act as the control and evidence layer.

from prysmai import PrysmClient

prysm = PrysmClient(prysm_key="sk-prysm-...")
mcp = prysm.mcp()

config = mcp.connection_config()

print(config.server_url)
print(config.headers)

For MCP-compatible runtimes, hand them:

  • config.server_url
  • config.headers

Then use Prysm's MCP tools and resources to record model calls, tool activity, decisions, file changes, and governance evidence.
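The runtime-specific wiring varies, but the hand-off usually reduces to passing a URL plus auth headers. A minimal sketch of that shape (the `ConnectionConfig` stand-in and `to_runtime_settings` helper below are illustrative assumptions, not part of the SDK):

```python
# Illustrative only: shows the shape of the hand-off, not a real runtime API.
from dataclasses import dataclass, field


@dataclass
class ConnectionConfig:
    """Stand-in for the object returned by mcp.connection_config()."""
    server_url: str
    headers: dict = field(default_factory=dict)


def to_runtime_settings(config: ConnectionConfig) -> dict:
    """Translate Prysm's MCP connection details into the generic
    {url, headers} mapping many MCP-compatible runtimes accept."""
    return {"url": config.server_url, "headers": dict(config.headers)}


cfg = ConnectionConfig(
    server_url="https://prysmai.io/api/mcp",
    headers={"Authorization": "Bearer sk-prysm-..."},
)
print(to_runtime_settings(cfg))
```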

4. Unified session scope

Use PrysmClient.session(...) when you want one correlated run across proxy traffic and governance activity.

from prysmai import PrysmClient

prysm = PrysmClient(prysm_key="sk-prysm-...")

with prysm.session(
    user_id="user_123",
    metadata={"feature": "support"},
    governance_task="Resolve a customer support request safely.",
    agent_type="codex",
    auto_check_interval=1,
) as run:
    client = run.llm()
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Draft a short response."}],
    )

    run.record_decision(
        description="Send a short and safe reply",
        selected_action="respond",
        severity="low",
    )

    run.run_tool(
        "search_docs",
        lambda query: {"result_count": 2, "query": query},
        "refund policy",
        tool_input={"query": "refund policy"},
    )

print(run.identifiers.session_id)
print(run.identifiers.governance_session_id)

Choosing The Right Path

Use the proxy path when:

  • your app already talks directly to an LLM provider
  • you want request/response capture automatically
  • you want security scanning on proxied traffic with minimal code changes

Use the MCP path when:

  • your runtime is MCP-native
  • you are connecting Prysm to an external agent runtime
  • you want session, decision, tool, and file evidence even when the model call happens outside Prysm's HTTP proxy

Use a unified session when:

  • one run spans model calls, tools, file changes, and governance activity
  • you want one correlated session in the Prysm dashboard

Core SDK Surface

PrysmClient

The root client for the Prysm control plane.

from prysmai import PrysmClient

prysm = PrysmClient(prysm_key="sk-prysm-...")

proxy_client = prysm.llm()
mcp_client = prysm.mcp()
session = prysm.session(governance_task="Review a change", agent_type="codex")

prysm.openai() still works as a backward-compatible alias. The newer prysm.llm() name is more accurate: Prysm can route to Claude, Gemini, vLLM, Ollama, or another configured provider behind the same OpenAI-compatible surface.

prysm_context

Attach user, session, and metadata to proxied requests.

from prysmai import PrysmClient, prysm_context

client = PrysmClient(prysm_key="sk-prysm-...").openai()

with prysm_context(
    user_id="user_42",
    session_id="sess_checkout",
    metadata={"tenant": "acme", "feature": "checkout"},
):
    client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Help me check out."}],
    )

PrysmSession

Use PrysmSession helpers when you need to record governance-side events explicitly:

  • record_llm_call(...)
  • record_tool_call(...)
  • record_decision(...)
  • record_file_change(...)
  • record_delegation(...)
  • run_tool(...)
  • scan_code(...)

What Appears In Prysm

With the SDK wired correctly, Prysm can show:

  • model traces
  • latency, tokens, and cost
  • threat and policy findings
  • session events such as tool calls, decisions, and file changes
  • governance reports and reviewable evidence

Framework Integrations

The SDK also includes integrations for:

  • LangGraph
  • CrewAI
  • Microsoft Agent Framework
  • LlamaIndex

You can initialize these from the shared PrysmClient so they use the same auth and base URL model.

LangGraph

from prysmai import PrysmClient

prysm = PrysmClient(prysm_key="sk-prysm-...")
monitor = prysm.langgraph_monitor(
    user_id="user_123",
    metadata={"framework": "langgraph"},
    governance=True,
)

monitor.start_governance(
    task="Run a support workflow",
    available_tools=["search_docs"],
)

# `graph` is assumed to be a compiled LangGraph graph built elsewhere
for chunk in graph.stream(
    {"question": "Handle a duplicate charge request"},
    config={"callbacks": [monitor]},
):
    ...

report = monitor.end_governance()
monitor.close()

Agent Framework

from prysmai import PrysmClient

prysm = PrysmClient(prysm_key="sk-prysm-...")
monitor = prysm.agent_framework_monitor(
    user_id="user_123",
    metadata={"framework": "agent_framework"},
    governance=True,
)

# `client` is assumed to be an existing Agent Framework chat client
agent = client.as_agent(
    name="SupportBot",
    middleware=monitor.middleware(),
)

CrewAI and LlamaIndex

The SDK also includes:

  • prysm.crewai_monitor(...) for CrewAI event-bus telemetry
  • prysm.llamaindex_handler(...) for LlamaIndex callback telemetry

See the framework examples and developer guide for setup and optional dependencies.

Notes

  • LangGraph, Agent Framework, CrewAI, and LlamaIndex paths have all been exercised against a live local Prysm server, not just mock tests.
  • Framework integrations primarily emit telemetry and governance evidence into the same control plane used by the proxy and MCP paths.
  • Example files:
    • examples/langgraph_monitor.py
    • examples/agent_framework_monitor.py

Configuration

The SDK resolves connection settings from:

  • explicit arguments
  • then environment variables

Environment variables:

  • PRYSM_API_KEY
  • PRYSM_BASE_URL

Default base URL:

https://prysmai.io/api/v1
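The precedence above can be sketched as follows. This is a simplified stand-in for the SDK's internal resolution logic, not its actual implementation:

```python
import os

DEFAULT_BASE_URL = "https://prysmai.io/api/v1"


def resolve_settings(prysm_key=None, base_url=None):
    """Explicit arguments win; environment variables fill the gaps;
    the base URL falls back to the public default."""
    return {
        "prysm_key": prysm_key or os.environ.get("PRYSM_API_KEY"),
        "base_url": base_url or os.environ.get("PRYSM_BASE_URL", DEFAULT_BASE_URL),
    }


os.environ["PRYSM_API_KEY"] = "sk-prysm-env"
print(resolve_settings())                               # env key, default URL
print(resolve_settings(prysm_key="sk-prysm-explicit"))  # explicit key wins
```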

Local Development

For local Prysm development:

from prysmai import PrysmClient

prysm = PrysmClient(
    prysm_key="sk-prysm-...",
    base_url="http://localhost:3000/api/v1",
)

The MCP server for that same deployment will resolve to:

http://localhost:3000/api/mcp
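In other words, the MCP endpoint follows the same host as the proxy, with the /api/v1 suffix replaced by /api/mcp. A small sketch of that rule (inferred from the two URLs above; `mcp_url_for` is not a documented SDK helper):

```python
def mcp_url_for(base_url: str) -> str:
    """Derive the MCP endpoint for a deployment from its proxy base URL."""
    return base_url.rstrip("/").removesuffix("/api/v1") + "/api/mcp"


print(mcp_url_for("http://localhost:3000/api/v1"))  # http://localhost:3000/api/mcp
```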

Status

The SDK is still early, but the core product direction is now:

  • one control plane
  • two integration paths
  • shared evidence and governance outcomes
