arize-phoenix-client

Phoenix Client is a lightweight package for interacting with the Phoenix server.


Features

  • API - Interact with Phoenix's OpenAPI REST interface
  • Prompt Management - Pull, push, and invoke prompts stored in Phoenix

Installation

Install via pip.

pip install -Uq arize-phoenix-client

Usage

from phoenix.client import Client

client = Client(base_url="your-server-url")  # base_url defaults to http://localhost:6006
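
Under the hood, the client talks to Phoenix's OpenAPI REST interface, so the same endpoints can also be called with any HTTP library. Below is a minimal sketch using httpx; the GET /v1/projects route and the bearer header are assumptions here, so check your server's OpenAPI docs for the routes and auth scheme it actually exposes.

import httpx

base_url = "http://localhost:6006"  # your Phoenix server URL
headers = {"Authorization": "Bearer your-api-key"}  # omit if auth is disabled

with httpx.Client(base_url=base_url, headers=headers) as http:
    # assumed endpoint for listing projects; adjust to the routes your server serves
    response = http.get("/v1/projects")
    response.raise_for_status()
    print(response.json())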

Authentication (if applicable)

The Phoenix API key can be set as an environment variable...

import os

os.environ["PHOENIX_API_KEY"] = "your-api-key"

...or passed directly to the client.

from phoenix.client import Client

client = Client(api_key="your-api-key")

Custom Headers

By default, the Phoenix client uses the bearer authentication scheme in its HTTP headers. If you need different headers, e.g. for Phoenix Cloud, they can be customized via an environment variable...

import os

os.environ["PHOENIX_CLIENT_HEADERS"] = "api-key=your-api-key,"  # use `api-key` for Phoenix Cloud

...or passed directly to the client.

from phoenix.client import Client

client = Client(headers={"api-key": "your-api-key"})  # use `api-key` for Phoenix Cloud

Prompt Management

With the Phoenix client, you can push and pull prompts to and from your Phoenix server.

from phoenix.client import Client
from phoenix.client.types import PromptVersion

# Change base_url to your Phoenix server URL
base_url = "http://localhost:6006"
client = Client(base_url=base_url)

# prompt identifier consists of alphanumeric characters, hyphens or underscores
prompt_identifier = "haiku-writer"

content = "Write a haiku about {{topic}}"
prompt = client.prompts.create(
    name=prompt_identifier,
    version=PromptVersion(
        [{"role": "user", "content": content}],
        model_name="gpt-4o-mini",
    ),
)

The client can retrieve a prompt by its name.

prompt = client.prompts.get(prompt_identifier=prompt_identifier)

The prompt can be used to generate completions.

from openai import OpenAI

variables = {"topic": "programming"}
# prompt.format(...) fills in the template variables and returns keyword
# arguments (messages, model, etc.) for the OpenAI chat completions call
resp = OpenAI().chat.completions.create(**prompt.format(variables=variables))
print(resp.choices[0].message.content)

To learn more about prompt engineering using Phoenix, see the Phoenix documentation.

Download files

Download the file for your platform.

Source Distribution

arize_phoenix_client-1.12.0.tar.gz (64.9 kB, source)

Built Distribution

arize_phoenix_client-1.12.0-py3-none-any.whl (65.3 kB, Python 3)

File details

Details for the file arize_phoenix_client-1.12.0.tar.gz.

File metadata

  • Download URL: arize_phoenix_client-1.12.0.tar.gz
  • Upload date:
  • Size: 64.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for arize_phoenix_client-1.12.0.tar.gz:

  • SHA256: 41048e8451a69c4b02386ccce908570519dcec375dc55f3348c2bb666b783cd0
  • MD5: fd0053c5dbcc3e9ecd073b81111a7b6e
  • BLAKE2b-256: ba7d11a4bff0b7a107ba8f2fdc6959bca019d5daa6bcc0d0035c08e24c8c75cc


File details

Details for the file arize_phoenix_client-1.12.0-py3-none-any.whl.

File hashes

Hashes for arize_phoenix_client-1.12.0-py3-none-any.whl:

  • SHA256: 45277e551c82c2bd2fe926e20af2bbb803ef3041ddb1ea5cad27da5df0d42882
  • MD5: 094da5b6d64d927c2bf17a6c09a5c9d3
  • BLAKE2b-256: 06bbb1ac38d33745f4b25602a7360c74bc1fc567f61b81d12ab9e83e66ded475

