LLM Observability
Project description
Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults. Phoenix OTEL also gives you access to tracing decorators for common GenAI patterns.
Features
arize-phoenix-otel simplifies OpenTelemetry configuration for Phoenix users by providing:
- Phoenix-aware defaults for common OpenTelemetry primitives
- Automatic configuration from environment variables
- Drop-in replacements for OTel classes with enhanced functionality
- Simplified tracing setup with the register() function
- Tracing decorators for GenAI patterns
Key Benefits
- Zero Code Changes: Enable auto_instrument=True to automatically instrument AI libraries
- Production Ready: Built-in batching and authentication
- Phoenix Integration: Seamless integration with Phoenix Cloud and self-hosted instances
- OpenTelemetry Compatible: Works with existing OpenTelemetry infrastructure
These defaults are aware of environment variables you may have set to configure Phoenix:
- PHOENIX_COLLECTOR_ENDPOINT
- PHOENIX_PROJECT_NAME
- PHOENIX_CLIENT_HEADERS
- PHOENIX_API_KEY
- PHOENIX_GRPC_PORT
Installation
Install via pip:
pip install arize-phoenix-otel
Quick Start
Recommended: Enable automatic instrumentation to trace your AI libraries with zero code changes:
from phoenix.otel import register
# Recommended: Automatic instrumentation + production settings
tracer_provider = register(
    auto_instrument=True,   # Auto-trace OpenAI, LangChain, LlamaIndex, etc.
    batch=True,             # Production-ready batching
    project_name="my-app",  # Organize your traces
)
That's it! Any AI library covered by an installed openinference-* instrumentation package is now automatically traced, and its spans are sent to Phoenix.
Note: auto_instrument=True only works if the corresponding OpenInference instrumentation libraries are installed. For example, to automatically trace OpenAI calls, you need openinference-instrumentation-openai installed:
pip install openinference-instrumentation-openai
pip install openinference-instrumentation-langchain # For LangChain
pip install openinference-instrumentation-llama-index # For LlamaIndex
See the OpenInference repository for the complete list of available instrumentation packages.
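As a concrete illustration, here is a minimal sketch assuming openinference-instrumentation-openai is installed and OPENAI_API_KEY is set in the environment; the model name is only a placeholder. Once register(auto_instrument=True) has run, calls made through the OpenAI client are traced with no additional code:

from openai import OpenAI
from phoenix.otel import register

# Registers a tracer provider and auto-instruments installed OpenInference libraries
register(auto_instrument=True, project_name="my-app")

client = OpenAI()

# This call is captured as a span automatically -- no decorators or manual spans needed
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)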
Authentication
Set your API key as an environment variable:
export PHOENIX_API_KEY="your-api-key"
# Or pass directly to register()
tracer_provider = register(api_key="your-api-key")
Endpoint Configuration
Configure where to send your traces:
Environment Variables (Recommended):
export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/s/your-space"
export PHOENIX_PROJECT_NAME="my-project"
Direct Configuration:
tracer_provider = register(
    endpoint="http://localhost:6006/v1/traces",  # HTTP endpoint
    # protocol="grpc",  # uncomment to force the gRPC protocol instead
)
Usage Examples
Simple Setup
from phoenix.otel import register
# Basic setup - sends to localhost
tracer_provider = register(auto_instrument=True)
Production Configuration
tracer_provider = register(
    project_name="my-production-app",
    auto_instrument=True,    # Auto-trace AI/ML libraries
    batch=True,              # Background batching for performance
    api_key="your-api-key",  # Authentication
    endpoint="https://app.phoenix.arize.com/s/your-space",
)
Manual Configuration
For advanced use cases, use Phoenix OTEL components directly:
from phoenix.otel import TracerProvider, BatchSpanProcessor, HTTPSpanExporter

tracer_provider = TracerProvider()

# Export spans over HTTP to a local Phoenix instance
exporter = HTTPSpanExporter(endpoint="http://localhost:6006/v1/traces")

# Batch spans in the background before handing them to the exporter
processor = BatchSpanProcessor(span_exporter=exporter)
tracer_provider.add_span_processor(processor)
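A minimal sketch of emitting a span through this manually configured pipeline; the span name is illustrative:

tracer = tracer_provider.get_tracer(__name__)

with tracer.start_as_current_span("manual-span"):
    ...  # work done here is exported via the batch processor configured above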
Using Decorators
from phoenix.otel import register

tracer_provider = register()

# Get a tracer for manual instrumentation
tracer = tracer_provider.get_tracer(__name__)

@tracer.chain
def process_data(data):
    return data + " processed"

@tracer.tool
def weather(location):
    return "sunny"
Environment Variables
| Variable | Description | Example |
|---|---|---|
| PHOENIX_COLLECTOR_ENDPOINT | Where to send traces | https://app.phoenix.arize.com/s/your-space |
| PHOENIX_PROJECT_NAME | Project name | my-llm-app |
| PHOENIX_API_KEY | Authentication key | your-api-key |
| PHOENIX_CLIENT_HEADERS | Custom headers | Authorization=Bearer token |
| PHOENIX_GRPC_PORT | gRPC port override | 4317 |
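For instance, a process can be configured entirely through these variables, and register() then needs no arguments. A minimal sketch with placeholder values (in practice these would be set in the shell or deployment config rather than in code):

import os

# Placeholder values for illustration
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com/s/your-space"
os.environ["PHOENIX_PROJECT_NAME"] = "my-llm-app"
os.environ["PHOENIX_API_KEY"] = "your-api-key"

from phoenix.otel import register

# Picks up the endpoint, project name, and API key from the environment
tracer_provider = register(auto_instrument=True)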
Coding Agent Skill
The Phoenix repo includes a phoenix-tracing skill that teaches coding agents (Claude Code, Cursor, etc.) how to instrument LLM applications with OpenInference tracing. Install it with:
npx skills add Arize-ai/phoenix --skill phoenix-tracing
Documentation
- Full Documentation - Complete API reference and guides
- Phoenix Docs - Detailed tracing examples and patterns
- OpenInference - Auto-instrumentation libraries for frameworks
Community
Join our community to connect with thousands of AI builders:
- 🌍 Join our Slack community.
- 💡 Ask questions and provide feedback in the #phoenix-support channel.
- 🌟 Leave a star on our GitHub.
- 🐞 Report bugs with GitHub Issues.
- 𝕏 Follow us on 𝕏.
- 🗺️ Check out our roadmap to see where we're heading next.
Project details
Download files
File details
Details for the file arize_phoenix_otel-0.16.0.tar.gz.
File metadata
- Download URL: arize_phoenix_otel-0.16.0.tar.gz
- Upload date:
- Size: 20.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9436595f3cdff919d45a8cfd0acbd69b0821f836e913b5279bac50a90be832c2 |
| MD5 | 16e53a2332ee2261d00ddbe1de06f514 |
| BLAKE2b-256 | 49c8f59e45a45ea25af242cc3726af2976787074e68101d44f8ae5501163dec0 |
Provenance
The following attestation bundles were made for arize_phoenix_otel-0.16.0.tar.gz:
Publisher: publish.yaml on Arize-ai/phoenix
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: arize_phoenix_otel-0.16.0.tar.gz
- Subject digest: 9436595f3cdff919d45a8cfd0acbd69b0821f836e913b5279bac50a90be832c2
- Sigstore transparency entry: 1372662933
- Sigstore integration time:
- Permalink: Arize-ai/phoenix@4f360a92de817a33b37f06dbf65ea0cb08f7e46d
- Branch / Tag: refs/heads/main
- Owner: https://github.com/Arize-ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yaml@4f360a92de817a33b37f06dbf65ea0cb08f7e46d
- Trigger Event: workflow_dispatch
File details
Details for the file arize_phoenix_otel-0.16.0-py3-none-any.whl.
File metadata
- Download URL: arize_phoenix_otel-0.16.0-py3-none-any.whl
- Upload date:
- Size: 18.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c3c455cccb583d25f1976ad56f973e12506eec9d86f2c35f2bd6c17ccfaa9943 |
| MD5 | 29c67411a84abfd8a50da04e4bac754f |
| BLAKE2b-256 | 436f593f8df242ff66e3b908ce9117edde0b5ae1f624704283e12256bbb6ad25 |
Provenance
The following attestation bundles were made for arize_phoenix_otel-0.16.0-py3-none-any.whl:
Publisher: publish.yaml on Arize-ai/phoenix
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: arize_phoenix_otel-0.16.0-py3-none-any.whl
- Subject digest: c3c455cccb583d25f1976ad56f973e12506eec9d86f2c35f2bd6c17ccfaa9943
- Sigstore transparency entry: 1372662983
- Sigstore integration time:
- Permalink: Arize-ai/phoenix@4f360a92de817a33b37f06dbf65ea0cb08f7e46d
- Branch / Tag: refs/heads/main
- Owner: https://github.com/Arize-ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yaml@4f360a92de817a33b37f06dbf65ea0cb08f7e46d
- Trigger Event: workflow_dispatch