
LLM Observability

arize-phoenix-otel

Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults. Phoenix OTEL also gives you access to tracing decorators for common GenAI patterns.

Features

arize-phoenix-otel simplifies OpenTelemetry configuration for Phoenix users by providing:

  • Phoenix-aware defaults for common OpenTelemetry primitives
  • Automatic configuration from environment variables
  • Drop-in replacements for OTel classes with enhanced functionality
  • Simplified tracing setup with the register() function
  • Tracing decorators for GenAI patterns

Key Benefits

  • Zero Code Changes: Enable auto_instrument=True to automatically instrument AI libraries
  • Production Ready: Built-in batching and authentication
  • Phoenix Integration: Seamless integration with Phoenix Cloud and self-hosted instances
  • OpenTelemetry Compatible: Works with existing OpenTelemetry infrastructure

These defaults are aware of environment variables you may have set to configure Phoenix:

  • PHOENIX_COLLECTOR_ENDPOINT
  • PHOENIX_PROJECT_NAME
  • PHOENIX_CLIENT_HEADERS
  • PHOENIX_API_KEY
  • PHOENIX_GRPC_PORT

Installation

Install via pip:

pip install arize-phoenix-otel

Quick Start

Recommended: Enable automatic instrumentation to trace your AI libraries with zero code changes:

from phoenix.otel import register

# Recommended: Automatic instrumentation + production settings
tracer_provider = register(
    auto_instrument=True,  # Auto-trace OpenAI, LangChain, LlamaIndex, etc.
    batch=True,           # Production-ready batching
    project_name="my-app" # Organize your traces
)

That's it! Any AI library with a matching OpenInference instrumentation package installed is now automatically traced, and its spans are sent to Phoenix.

Note: auto_instrument=True only works if the corresponding OpenInference instrumentation libraries are installed. For example, to automatically trace OpenAI calls, you need openinference-instrumentation-openai installed:

pip install openinference-instrumentation-openai
pip install openinference-instrumentation-langchain  # For LangChain
pip install openinference-instrumentation-llama-index  # For LlamaIndex

See the OpenInference repository for the complete list of available instrumentation packages.

Authentication

Authenticate with Phoenix Cloud or a secured self-hosted instance by setting an API key:

export PHOENIX_API_KEY="your-api-key"
# Or pass directly to register()
tracer_provider = register(api_key="your-api-key")

Endpoint Configuration

Configure where to send your traces:

Environment Variables (Recommended):

export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/s/your-space"
export PHOENIX_PROJECT_NAME="my-project"

Direct Configuration:

tracer_provider = register(
    endpoint="http://localhost:6006/v1/traces",  # HTTP endpoint
    # protocol="grpc",  # Optionally force gRPC instead of inferring from the endpoint
)

Usage Examples

Simple Setup

from phoenix.otel import register

# Basic setup - sends to localhost
tracer_provider = register(auto_instrument=True)

Production Configuration

tracer_provider = register(
    project_name="my-production-app",
    auto_instrument=True,      # Auto-trace AI/ML libraries
    batch=True,               # Background batching for performance
    api_key="your-api-key",   # Authentication
    endpoint="https://app.phoenix.arize.com/s/your-space"
)

Manual Configuration

For advanced use cases, use Phoenix OTEL components directly:

from phoenix.otel import TracerProvider, BatchSpanProcessor, HTTPSpanExporter

tracer_provider = TracerProvider()
exporter = HTTPSpanExporter(endpoint="http://localhost:6006/v1/traces")
processor = BatchSpanProcessor(span_exporter=exporter)
tracer_provider.add_span_processor(processor)
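The wiring above follows the standard OpenTelemetry pipeline: a provider hands finished spans to a processor, which batches them and forwards them to an exporter. Here is a toy, dependency-free sketch of that batching behavior (these classes are illustrative stand-ins, not the real Phoenix or OTel classes):

```python
# Toy span pipeline: a processor buffers finished spans and exports them in batches.
class ToyExporter:
    def __init__(self):
        self.exported = []

    def export(self, spans):
        # A real exporter would serialize spans and send them over HTTP or gRPC.
        self.exported.extend(spans)

class ToyBatchProcessor:
    def __init__(self, exporter, batch_size=2):
        self.exporter = exporter
        self.batch_size = batch_size
        self.buffer = []

    def on_end(self, span):
        # Called when a span finishes; flush once the batch is full.
        self.buffer.append(span)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        self.exporter.export(self.buffer)
        self.buffer = []

exporter = ToyExporter()
processor = ToyBatchProcessor(exporter)
processor.on_end("span-1")  # buffered, not yet exported
processor.on_end("span-2")  # batch is full, both spans exported
print(exporter.exported)
```

Batching is why `batch=True` is recommended in production: spans are exported in the background in groups rather than one network call per span.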

Using Decorators

from phoenix.otel import register

tracer_provider = register()

# Get a tracer for manual instrumentation
tracer = tracer_provider.get_tracer(__name__)

@tracer.chain
def process_data(data):
    return data + " processed"

@tracer.tool
def weather(location):
    return "sunny"
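
Conceptually, a decorator like `@tracer.chain` wraps the function call in a span and records its input and output. The following is a toy illustration of that idea, not the real Phoenix implementation:

```python
from contextlib import contextmanager

# Record of "spans" for this toy example.
recorded_spans = []

@contextmanager
def toy_span(name):
    span = {"name": name}
    recorded_spans.append(span)
    yield span

def chain(fn):
    # Toy stand-in for @tracer.chain: open a span around the call,
    # record the input and output, and return the result unchanged.
    def wrapper(*args, **kwargs):
        with toy_span(fn.__name__) as span:
            span["input"] = args
            result = fn(*args, **kwargs)
            span["output"] = result
            return result
    return wrapper

@chain
def process_data(data):
    return data + " processed"

print(process_data("raw"))  # prints "raw processed"; one span is recorded
```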

Environment Variables

  • PHOENIX_COLLECTOR_ENDPOINT: Where to send traces (e.g. https://app.phoenix.arize.com/s/your-space)
  • PHOENIX_PROJECT_NAME: Project name (e.g. my-llm-app)
  • PHOENIX_API_KEY: Authentication key (e.g. your-api-key)
  • PHOENIX_CLIENT_HEADERS: Custom headers (e.g. Authorization=Bearer token)
  • PHOENIX_GRPC_PORT: gRPC port override (e.g. 4317)

Coding Agent Skill

The Phoenix repo includes a phoenix-tracing skill that teaches coding agents (Claude Code, Cursor, etc.) how to instrument LLM applications with OpenInference tracing. Install it with:

npx skills add Arize-ai/phoenix --skill phoenix-tracing

Community

Join our community to connect with thousands of AI builders:

  • 🌍 Join our Slack community.
  • 💡 Ask questions and provide feedback in the #phoenix-support channel.
  • 🌟 Leave a star on our GitHub.
  • 🐞 Report bugs with GitHub Issues.
  • 𝕏 Follow us on 𝕏.
  • 💼 Follow us on LinkedIn.
  • 🗺️ Check out our roadmap to see where we're heading next.
