
catalyst-tracing

First-party OpenInference-shaped tracing for Python LLM and agent applications running on Catalyst by Inference.net.

catalyst-tracing gives you one Python package for instrumenting common model SDKs, agent frameworks, and custom agent work. It emits OpenTelemetry spans with OpenInference-compatible attributes over OTLP/HTTP so Catalyst can display model calls, tool calls, prompts, responses, token usage, and parent-child agent flows.

This package is currently in beta. APIs may change before 1.0, but the package name and import path are intended to remain stable.

Install

Install the base tracing runtime:

pip install catalyst-tracing

Install only the integrations your application uses:

pip install 'catalyst-tracing[openai]'
pip install 'catalyst-tracing[anthropic]'
pip install 'catalyst-tracing[langchain]'
pip install 'catalyst-tracing[langgraph]'
pip install 'catalyst-tracing[langsmith]'
pip install 'catalyst-tracing[openai-agents]'
pip install 'catalyst-tracing[claude-agent-sdk]'
pip install 'catalyst-tracing[pydantic-ai]'

You can combine extras:

pip install 'catalyst-tracing[openai,anthropic,langchain]'

Quick Start

Set your Catalyst endpoint and token:

export CATALYST_OTLP_ENDPOINT="https://your-catalyst-otlp-endpoint"
export CATALYST_OTLP_TOKEN="your-token"
export CATALYST_SERVICE_NAME="checkout-agent"

Initialize tracing before creating SDK clients:

from catalyst_tracing import setup
from openai import OpenAI

tracing = setup()

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize this order."}],
)

tracing.shutdown()

The OpenAI call is captured as an OpenInference-shaped LLM span and exported to Catalyst through OTLP/HTTP.

What It Instruments

| Integration | Install extra | What is captured |
| --- | --- | --- |
| OpenAI | `openai` | Chat Completions, Responses, sync clients, and async clients |
| Anthropic | `anthropic` | Messages API calls, sync clients, and async clients |
| LangChain | `langchain` | Callback-manager-driven chain, model, tool, and retriever spans |
| LangGraph | `langgraph` | Graph and node spans through the LangChain callback path |
| LangSmith | `langsmith` | LangSmith OpenTelemetry spans bridged into the Catalyst provider |
| OpenAI Agents | `openai-agents` | Agent runs plus nested OpenAI model spans |
| Claude Agent SDK | `claude-agent-sdk` | `query()` calls and yielded agent messages |
| Pydantic AI | `pydantic-ai` | Pydantic AI's native OpenTelemetry instrumentation |

The base package includes only the tracing runtime. Each extra pulls in the corresponding upstream SDK, so production environments install only the integrations they actually use.

Public API

Most applications only need setup():

from catalyst_tracing import setup

tracing = setup(
    service_name="support-agent",
    service_version="0.4.0",
)

setup() returns a CatalystTracing handle with:

| Attribute | Purpose |
| --- | --- |
| `provider` | OpenTelemetry `TracerProvider` configured for Catalyst export |
| `tracer` | Tracer for manual spans |
| `install_results` | Per-integration install results |
| `shutdown()` | Flush and close tracing before process exit |

You can also import integration installers directly:

from catalyst_tracing import setup
from catalyst_tracing.openai import install_openai

tracing = setup(auto_instrument=False)
install_openai(tracing.provider)

Available entry-point modules:

| Import | Export |
| --- | --- |
| `catalyst_tracing.openai` | `install_openai` |
| `catalyst_tracing.anthropic` | `install_anthropic` |
| `catalyst_tracing.langchain` | `install_langchain` |
| `catalyst_tracing.langgraph` | `install_langgraph` |
| `catalyst_tracing.langsmith` | `install_langsmith` |
| `catalyst_tracing.openai_agents` | `install_openai_agents` |
| `catalyst_tracing.claude_agent_sdk` | `install_claude_agent_sdk` |
| `catalyst_tracing.pydantic_ai` | `install_pydantic_ai` |

Manual Agent Spans

Use agent_span() when work does not go through a supported SDK, such as a CLI subprocess, custom router, planner, evaluator, or tool executor.

from catalyst_tracing import agent_span, setup

tracing = setup()

with agent_span(tracing.tracer, name="RefundReviewAgent", system="internal") as span:
    span.set_input("Review refund request #1842")
    decision = run_refund_review()
    span.set_output(decision.summary)
    span.record_tokens(prompt=820, completion=160)

tracing.shutdown()

Any child spans created inside the context automatically parent under the agent span through standard OpenTelemetry context propagation.

Configuration

You can configure tracing with keyword arguments or environment variables.

| Option | Environment variable | Default |
| --- | --- | --- |
| `endpoint` | `CATALYST_OTLP_ENDPOINT` | `http://localhost:8799` |
| `token` | `CATALYST_OTLP_TOKEN` | unset |
| `service_name` | `CATALYST_SERVICE_NAME` | generated `catalyst-app-*` name |
| `service_version` | `CATALYST_SERVICE_VERSION` | `0.0.1` |
| `debug` | `CATALYST_DEBUG` | `false` |
| `batching` | none | `"batch"` |

Legacy OTLP_ENDPOINT, OTLP_INGEST_TOKEN, and SERVICE_NAME variables are also accepted for compatibility.

Span Shape

Spans use OpenInference-style semantic attributes so LLM-aware viewers can understand them without custom adapters:

| Attribute family | Examples |
| --- | --- |
| Span kind | `openinference.span.kind` |
| Inputs and outputs | `input.value`, `output.value` |
| Messages | `llm.input_messages.*`, `llm.output_messages.*` |
| Model metadata | `llm.model_name`, `llm.invocation_parameters` |
| Token counts | `llm.token_count.prompt`, `llm.token_count.completion`, `llm.token_count.total` |
| Provider/system | `gen_ai.system` |

Constants are exported for custom spans, for example on a manually created span:

from catalyst_tracing import Attr, SpanKindValues, setup

tracing = setup()

with tracing.tracer.start_as_current_span("score-order") as span:
    span.set_attribute(Attr.SPAN_KIND, SpanKindValues.LLM.value)
    span.set_attribute(Attr.MODEL_NAME, "gpt-4o-mini")

Error Handling

The package raises typed errors for misuse and returns structured install results for optional integrations:

from catalyst_tracing import CatalystTracingError, InvalidTracerProviderError, setup
from catalyst_tracing.openai import install_openai

tracing = setup(auto_instrument=False)

try:
    result = install_openai(tracing.provider)
except InvalidTracerProviderError as exc:
    print(exc.code)
except CatalystTracingError:
    raise

Each installer returns an InstrumentResult with:

| Field | Meaning |
| --- | --- |
| `name` | Integration name |
| `installed` | Whether instrumentation was installed |
| `code` | Stable status code such as `INSTALLED` or `SDK_NOT_INSTALLED` |
| `reason` | Human-readable detail when installation is skipped |

Package Names

The primary package is catalyst-tracing and the primary import path is catalyst_tracing.

Inference also publishes inference-catalyst-tracing as a company-qualified install name. It depends on this package and re-exports the same public API from the inference_catalyst_tracing import path.
