
LLM Observability

Project description

arize-phoenix-client

Phoenix Client is a lightweight package for interacting with the Phoenix server.


Features

  • API - Interact with Phoenix's OpenAPI REST interface (see the sketch after this list)
  • Prompt Management - Pull, push, and invoke prompts stored in Phoenix
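
The REST surface is served from the same host and port as the Phoenix UI. As a minimal sketch of talking to it directly (assuming a server at http://localhost:6006 and, as an assumption, that GET /v1/projects is among the routes in its OpenAPI schema), you can use httpx:

import httpx

base_url = "http://localhost:6006"  # your Phoenix server URL
# Assumption: GET /v1/projects is one of the documented OpenAPI routes; check
# your server's schema for the full list of endpoints.
resp = httpx.get(
    f"{base_url}/v1/projects",
    headers={"Authorization": "Bearer your-api-key"},  # omit if auth is disabled
)
resp.raise_for_status()
print(resp.json())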

Installation

Install via pip.

pip install -Uq arize-phoenix-client

Usage

from phoenix.client import Client

client = Client(base_url="your-server-url")  # base_url defaults to http://localhost:6006
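
If you would rather not hard-code the URL, one illustrative pattern (an assumption about your deployment, not a requirement of the package) is to read it from an environment variable such as PHOENIX_COLLECTOR_ENDPOINT and fall back to the local default:

import os

from phoenix.client import Client

# Illustrative: take the server URL from the environment when it is set,
# otherwise fall back to the local default.
base_url = os.environ.get("PHOENIX_COLLECTOR_ENDPOINT", "http://localhost:6006")
client = Client(base_url=base_url)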

Authentication (if applicable)

The Phoenix API key can be set as an environment variable...

import os

os.environ["PHOENIX_API_KEY"] = "your-api-key"

...or passed directly to the client.

from phoenix.client import Client

client = Client(api_key="your-api-key")

Custom Headers

By default, the Phoenix client sends your API key using the bearer authentication scheme in its HTTP headers. If you need different headers, e.g. for Phoenix Cloud, they can be customized via an environment variable...

import os

os.environ["PHOENIX_CLIENT_HEADERS"] = "api-key=your-api-key,"  # use `api-key` for Phoenix Cloud

...or passed directly to the client.

from phoenix.client import Client

client = Client(headers={"api-key": "your-api-key"})  # use `api-key` for Phoenix Cloud

Prompt Management

With the Phoenix client, you can push and pull prompts to and from your Phoenix server.

from phoenix.client import Client
from phoenix.client.types import PromptVersion

# Change base_url to your Phoenix server URL
base_url = "http://localhost:6006"
client = Client(base_url=base_url)

# The prompt identifier consists of alphanumeric characters, hyphens, or underscores
prompt_identifier = "haiku-writer"

content = "Write a haiku about {{topic}}"
prompt = client.prompts.create(
    name=prompt_identifier,
    version=PromptVersion(
        [{"role": "user", "content": content}],
        model_name="gpt-4o-mini",
    ),
)
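
Because each push supplies an explicit PromptVersion, creating a prompt again under the same name is expected to record a new version rather than overwrite the previous one; that versioning semantic is an assumption here, so verify it against your server. A sketch reusing the identifiers above:

# Assumption: pushing under an existing name adds a new version of that prompt.
prompt_v2 = client.prompts.create(
    name=prompt_identifier,
    version=PromptVersion(
        [{"role": "user", "content": "Write a haiku about {{topic}} in English"}],
        model_name="gpt-4o-mini",
    ),
)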

The client can retrieve a prompt by its name.

prompt = client.prompts.get(prompt_identifier=prompt_identifier)

The prompt can be used to generate completions.

from openai import OpenAI

variables = {"topic": "programming"}
resp = OpenAI().chat.completions.create(**prompt.format(variables=variables))
print(resp.choices[0].message.content)
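
Because prompt.format(variables=...) is unpacked straight into chat.completions.create above, it behaves like a mapping of request keyword arguments; printing it first is an easy way to inspect exactly what will be sent (no assumptions beyond the snippet above):

formatted = prompt.format(variables={"topic": "programming"})
print(dict(formatted))  # the keyword arguments passed to chat.completions.create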

To learn more about prompt engineering using Phoenix, see the Phoenix documentation.

Download files

Download the file for your platform.

Source Distribution

arize_phoenix_client-1.13.2.tar.gz (95.4 kB)

Built Distribution

arize_phoenix_client-1.13.2-py3-none-any.whl (99.4 kB)

File details

Details for the file arize_phoenix_client-1.13.2.tar.gz.

File metadata

  • Download URL: arize_phoenix_client-1.13.2.tar.gz
  • Upload date:
  • Size: 95.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for arize_phoenix_client-1.13.2.tar.gz
SHA256: b4a88c5eda408cbee3a06f77021667873e28a5635e93995bfa70260b0f4a55f9
MD5: 2220e3081c50ab81254034f1e3e4e7ac
BLAKE2b-256: c5483f64c4eaa5a579257928441dd4677974e14c10f69afc290de301447ae041


File details

Details for the file arize_phoenix_client-1.13.2-py3-none-any.whl.

File hashes

Hashes for arize_phoenix_client-1.13.2-py3-none-any.whl
SHA256: 398e4aa02789ff735f607c079d5c9da5919d0aee83012d4cb38b4e480d95e0e8
MD5: 6077546d70dd37eff6dc4e9d2615d616
BLAKE2b-256: cc74a2a3830b531579896b4f3179475f8ce2dab02b109035ae671429682fd1a3

