LLM Observability

arize-phoenix-otel


Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults. Phoenix OTEL also gives you access to tracing decorators for common GenAI patterns.

Features

arize-phoenix-otel simplifies OpenTelemetry configuration for Phoenix users by providing:

  • Phoenix-aware defaults for common OpenTelemetry primitives
  • Automatic configuration from environment variables
  • Drop-in replacements for OTel classes with enhanced functionality
  • Simplified tracing setup with the register() function
  • Tracing decorators for GenAI patterns

Key Benefits

  • Zero Code Changes: Enable auto_instrument=True to automatically instrument AI libraries
  • Production Ready: Built-in batching and authentication
  • Phoenix Integration: Seamless integration with Phoenix Cloud and self-hosted instances
  • OpenTelemetry Compatible: Works with existing OpenTelemetry infrastructure

The Phoenix-aware defaults automatically pick up any of these environment variables you have set to configure Phoenix:

  • PHOENIX_COLLECTOR_ENDPOINT
  • PHOENIX_PROJECT_NAME
  • PHOENIX_CLIENT_HEADERS
  • PHOENIX_API_KEY
  • PHOENIX_GRPC_PORT

Installation

Install via pip:

pip install arize-phoenix-otel

Quick Start

Recommended: Enable automatic instrumentation to trace your AI libraries with zero code changes:

from phoenix.otel import register

# Recommended: Automatic instrumentation + production settings
tracer_provider = register(
    auto_instrument=True,  # Auto-trace OpenAI, LangChain, LlamaIndex, etc.
    batch=True,           # Production-ready batching
    project_name="my-app" # Organize your traces
)

That's it! Any installed openinference-* instrumentation libraries will now automatically trace their corresponding AI libraries and send spans to Phoenix.

Note: auto_instrument=True only works if the corresponding OpenInference instrumentation libraries are installed. For example, to automatically trace OpenAI calls, you need openinference-instrumentation-openai installed:

pip install openinference-instrumentation-openai
pip install openinference-instrumentation-langchain  # For LangChain
pip install openinference-instrumentation-llama-index  # For LlamaIndex

See the OpenInference repository for the complete list of available instrumentation packages.

Authentication

Set your API key via an environment variable:

export PHOENIX_API_KEY="your-api-key"

Or pass it directly to register():

from phoenix.otel import register

tracer_provider = register(api_key="your-api-key")

Endpoint Configuration

Configure where to send your traces:

Environment Variables (Recommended):

export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/s/your-space"
export PHOENIX_PROJECT_NAME="my-project"

Direct Configuration:

tracer_provider = register(
    endpoint="http://localhost:6006/v1/traces"  # HTTP endpoint (protocol inferred)
)

# Or force the gRPC protocol explicitly:
tracer_provider = register(protocol="grpc")
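When both an endpoint and a protocol are in play, register() has to decide between OTLP/HTTP and gRPC transport. A minimal sketch of how such endpoint-based inference can work (infer_protocol is an illustrative helper, not part of the package):

```python
from urllib.parse import urlparse

def infer_protocol(endpoint: str) -> str:
    """Guess the OTLP transport from an endpoint URL.

    A path ending in /v1/traces indicates the OTLP/HTTP route;
    a bare host:port (conventionally port 4317) suggests gRPC.
    """
    parsed = urlparse(endpoint)
    if parsed.path.endswith("/v1/traces"):
        return "http/protobuf"
    return "grpc"

print(infer_protocol("http://localhost:6006/v1/traces"))  # http/protobuf
print(infer_protocol("http://localhost:4317"))            # grpc
```

Passing protocol explicitly, as shown above, bypasses any such inference.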

Usage Examples

Simple Setup

from phoenix.otel import register

# Basic setup - sends to localhost
tracer_provider = register(auto_instrument=True)

Production Configuration

tracer_provider = register(
    project_name="my-production-app",
    auto_instrument=True,      # Auto-trace AI/ML libraries
    batch=True,               # Background batching for performance
    api_key="your-api-key",   # Authentication
    endpoint="https://app.phoenix.arize.com/s/your-space"
)

Manual Configuration

For advanced use cases, use Phoenix OTEL components directly:

from opentelemetry import trace
from phoenix.otel import TracerProvider, BatchSpanProcessor, HTTPSpanExporter

tracer_provider = TracerProvider()
exporter = HTTPSpanExporter(endpoint="http://localhost:6006/v1/traces")
processor = BatchSpanProcessor(span_exporter=exporter)
tracer_provider.add_span_processor(processor)

# Register the provider globally so instrumentation libraries pick it up
trace.set_tracer_provider(tracer_provider)

Using Decorators

from phoenix.otel import register

tracer_provider = register()

# Get a tracer for manual instrumentation
tracer = tracer_provider.get_tracer(__name__)

@tracer.chain
def process_data(data):
    return data + " processed"

@tracer.tool
def weather(location):
    return "sunny"
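Conceptually, decorators like tracer.chain and tracer.tool wrap each call in a span named after the function. A simplified, dependency-free stand-in that illustrates the pattern (this is not the actual Phoenix implementation, which emits real OpenTelemetry spans):

```python
import functools

spans: list[str] = []  # stand-in for an exported span stream

def chain(func):
    """Record a start/end 'span' around every call to func."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        spans.append(f"CHAIN {func.__name__}: start")
        try:
            return func(*args, **kwargs)
        finally:
            spans.append(f"CHAIN {func.__name__}: end")
    return wrapper

@chain
def process_data(data):
    return data + " processed"

result = process_data("raw")
print(result)  # raw processed
print(spans)   # ['CHAIN process_data: start', 'CHAIN process_data: end']
```

The try/finally ensures the span is closed even if the wrapped function raises, mirroring how real tracing decorators record errors.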

Environment Variables

Variable                    Description           Example
--------------------------  --------------------  ------------------------------------------
PHOENIX_COLLECTOR_ENDPOINT  Where to send traces  https://app.phoenix.arize.com/s/your-space
PHOENIX_PROJECT_NAME        Project name          my-llm-app
PHOENIX_API_KEY             Authentication key    your-api-key
PHOENIX_CLIENT_HEADERS      Custom headers        Authorization=Bearer token
PHOENIX_GRPC_PORT           gRPC port override    4317

Coding Agent Skill

The Phoenix repo includes a phoenix-tracing skill that teaches coding agents (Claude Code, Cursor, etc.) how to instrument LLM applications with OpenInference tracing. Install it with:

npx skills add Arize-ai/phoenix --skill phoenix-tracing

Documentation

See the Phoenix documentation site for guides and the full API reference.

Community

Join our community to connect with thousands of AI builders:

  • 🌍 Join our Slack community.
  • 💡 Ask questions and provide feedback in the #phoenix-support channel.
  • 🌟 Leave a star on our GitHub.
  • 🐞 Report bugs with GitHub Issues.
  • 𝕏 Follow us on 𝕏.
  • 🗺️ Check out our roadmap to see where we're heading next.
