
AI Observability and Evaluation

Project description

phoenix banner


Phoenix is an open-source AI observability platform designed for experimentation, evaluation, and troubleshooting. It provides:

  • Tracing - Trace your LLM application's runtime using OpenTelemetry-based instrumentation.
  • Evaluation - Leverage LLMs to benchmark your application's performance using response and retrieval evals.
  • Datasets - Create versioned datasets of examples for experimentation, evaluation, and fine-tuning.
  • Experiments - Track and evaluate changes to prompts, LLMs, and retrieval.
  • Playground - Optimize prompts, compare models, adjust parameters, and replay traced LLM calls.
  • Prompt Management - Manage and test prompt changes systematically using version control, tagging, and experimentation.

Phoenix is vendor and language agnostic with out-of-the-box support for popular frameworks (🦙LlamaIndex, 🦜⛓LangChain, Haystack, 🧩DSPy, 🤗smolagents) and LLM providers (OpenAI, Bedrock, MistralAI, VertexAI, LiteLLM, Google GenAI and more). For details on auto-instrumentation, check out the OpenInference project.

Phoenix runs practically anywhere, including your local machine, a Jupyter notebook, a containerized deployment, or in the cloud.

Installation

Install Phoenix via pip or conda

pip install arize-phoenix
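Once installed, a local Phoenix instance can be started directly from Python. A minimal sketch, assuming the default local setup (launch_app serves the UI at http://localhost:6006 by default):

import phoenix as px

# Start an in-process Phoenix instance (commonly used in notebooks and local development)
session = px.launch_app()
print(session.url)  # open this URL in a browser to view the Phoenix UI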

Phoenix container images are available via Docker Hub and can be deployed using Docker or Kubernetes. Arize AI also provides cloud instances at app.phoenix.arize.com.

Packages

The arize-phoenix package includes the entire Phoenix platform. However, if you have already deployed the Phoenix platform, there are lightweight Python sub-packages and TypeScript packages that can be used in conjunction with it.

Python Subpackages

Package Description
arize-phoenix-otel Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults
arize-phoenix-client Lightweight client for interacting with the Phoenix server via its OpenAPI REST interface
arize-phoenix-evals Tooling to evaluate LLM applications, including RAG relevance, answer relevance, and more
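As an illustration of how the otel subpackage is typically used, the sketch below configures tracing against a Phoenix deployment; the project name and endpoint are placeholders for your own values:

from phoenix.otel import register

# Configure an OpenTelemetry tracer provider with Phoenix-aware defaults.
tracer_provider = register(
    project_name="my-llm-app",                      # placeholder project name
    endpoint="http://localhost:6006/v1/traces",     # placeholder Phoenix collector endpoint
    auto_instrument=True,                           # apply any installed OpenInference instrumentors
)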

TypeScript Subpackages

Package Description
@arizeai/phoenix-otel Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults
@arizeai/phoenix-client Client for the Arize Phoenix API
@arizeai/phoenix-evals TypeScript evaluation library for LLM applications (alpha release)
@arizeai/phoenix-mcp MCP server implementation for Arize Phoenix, providing a unified interface to Phoenix's capabilities
@arizeai/phoenix-cli CLI for fetching traces, datasets, and experiments for use with Claude Code, Cursor, and other coding agents

Tracing Integrations

Phoenix is built on top of OpenTelemetry and is vendor, language, and framework agnostic. For details about tracing integrations and example applications, see the OpenInference project.
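Each integration can also be applied explicitly rather than through auto-instrumentation. A sketch for the OpenAI integration, assuming the openinference-instrumentation-openai package from the table below is installed and the endpoint is a placeholder for your Phoenix instance:

from openinference.instrumentation.openai import OpenAIInstrumentor
from phoenix.otel import register

# Point traces at a Phoenix instance, then instrument the OpenAI client library.
tracer_provider = register(endpoint="http://localhost:6006/v1/traces")
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Subsequent OpenAI SDK calls are traced and exported to Phoenix.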

Python Integrations

Integration Package
OpenAI openinference-instrumentation-openai
OpenAI Agents openinference-instrumentation-openai-agents
LlamaIndex openinference-instrumentation-llama-index
DSPy openinference-instrumentation-dspy
AWS Bedrock openinference-instrumentation-bedrock
LangChain openinference-instrumentation-langchain
MistralAI openinference-instrumentation-mistralai
Google GenAI openinference-instrumentation-google-genai
Google ADK openinference-instrumentation-google-adk
Guardrails openinference-instrumentation-guardrails
VertexAI openinference-instrumentation-vertexai
CrewAI openinference-instrumentation-crewai
Haystack openinference-instrumentation-haystack
LiteLLM openinference-instrumentation-litellm
Groq openinference-instrumentation-groq
Instructor openinference-instrumentation-instructor
Anthropic openinference-instrumentation-anthropic
Smolagents openinference-instrumentation-smolagents
Agno openinference-instrumentation-agno
MCP openinference-instrumentation-mcp
Pydantic AI openinference-instrumentation-pydantic-ai
Autogen AgentChat openinference-instrumentation-autogen-agentchat
Portkey openinference-instrumentation-portkey

Span Processors

Normalize and convert traces emitted by other instrumentation libraries by adding span processors that map their data to OpenInference conventions.

Package Description
openinference-instrumentation-openlit OpenInference span processor for OpenLIT traces
openinference-instrumentation-openllmetry OpenInference span processor for OpenLLMetry (Traceloop) traces

JavaScript Integrations

Integration Package
OpenAI @arizeai/openinference-instrumentation-openai
LangChain.js @arizeai/openinference-instrumentation-langchain
Vercel AI SDK @arizeai/openinference-vercel
BeeAI @arizeai/openinference-instrumentation-beeai
Mastra @mastra/arize

Java Integrations

Integration Package
LangChain4j openinference-instrumentation-langchain4j
SpringAI openinference-instrumentation-springAI

Platforms

Platform Description Docs
BeeAI AI agent framework with built-in observability Integration Guide
Dify Open-source LLM app development platform Integration Guide
Envoy AI Gateway AI Gateway built on Envoy Proxy for AI workloads Integration Guide
LangFlow Visual framework for building multi-agent and RAG applications Integration Guide
LiteLLM Proxy Proxy server for LLMs Integration Guide

Security & Privacy

We take data security and privacy very seriously. For more details, see our Security and Privacy documentation.

Telemetry

By default, Phoenix collects basic web analytics (e.g., page views, UI interactions) to help us understand how Phoenix is used and improve the product. None of your trace data, evaluation results, or any sensitive information is ever collected.

You can opt out of telemetry by setting the environment variable PHOENIX_TELEMETRY_ENABLED=false.
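For example, the variable can be set in the shell or deployment configuration; a minimal sketch of doing the same from Python before Phoenix starts:

import os

# Disable Phoenix's built-in web analytics; must be set before the Phoenix server starts.
os.environ["PHOENIX_TELEMETRY_ENABLED"] = "false"

import phoenix as px
session = px.launch_app()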

Community

Join our community to connect with thousands of AI builders.

Breaking Changes

See the migration guide for a list of breaking changes.

Copyright, Patent, and License

Copyright 2025 Arize AI, Inc. All Rights Reserved.

Portions of this code are patent protected by one or more U.S. Patents. See the IP_NOTICE.

This software is licensed under the terms of the Elastic License 2.0 (ELv2). See LICENSE.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

arize_phoenix-12.35.0.tar.gz (2.4 MB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

arize_phoenix-12.35.0-py3-none-any.whl (2.6 MB)

File details

Details for the file arize_phoenix-12.35.0.tar.gz.

File metadata

  • Download URL: arize_phoenix-12.35.0.tar.gz
  • Upload date:
  • Size: 2.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for arize_phoenix-12.35.0.tar.gz
Algorithm Hash digest
SHA256 0833c4b478ebb7b02d7f74f04f6bbe6d68ed7cebf2d22baffc5eb4aaf4358002
MD5 f376d917ac259ad8819a6fd8ebcb7249
BLAKE2b-256 5c7fe2a298879e5b40efc8860c480dd9f79fb59e1495729828244235e4682266

See more details on using hashes here.
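As an illustration, the SHA256 digest above can be checked against a downloaded file using only the standard library; the file path below is a placeholder for wherever the archive was saved:

import hashlib

# Placeholder path to the downloaded source distribution.
path = "arize_phoenix-12.35.0.tar.gz"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Compare against the SHA256 value published above.
expected = "0833c4b478ebb7b02d7f74f04f6bbe6d68ed7cebf2d22baffc5eb4aaf4358002"
print("match" if digest == expected else "mismatch")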

Provenance

The following attestation bundles were made for arize_phoenix-12.35.0.tar.gz:

Publisher: release.yml on Arize-ai/phoenix

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file arize_phoenix-12.35.0-py3-none-any.whl.

File metadata

File hashes

Hashes for arize_phoenix-12.35.0-py3-none-any.whl
Algorithm Hash digest
SHA256 2dc5b214d82be151718b8089440f83ad4c8e59d9097e3459b458e071bf655995
MD5 d187292b84daa3ebaf98611f16783ef9
BLAKE2b-256 d4c43c1c839297a6b4e967d49cb4139454f329a6453589b04b457b1a95a665b9

See more details on using hashes here.

Provenance

The following attestation bundles were made for arize_phoenix-12.35.0-py3-none-any.whl:

Publisher: release.yml on Arize-ai/phoenix

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
