
LLM Observability

Project description

arize-phoenix-client

Phoenix Client is a lightweight package for interacting with the Phoenix server via its OpenAPI REST interface.


Installation

Install via pip.

pip install -Uq arize-phoenix-client

Usage

from phoenix.client import Client

client = Client(base_url="your-server-url")  # base_url defaults to http://localhost:6006

Authentication (if applicable)

The Phoenix API key can be set as an environment variable...

import os

os.environ["PHOENIX_API_KEY"] = "your-api-key"

...or passed directly to the client.

from phoenix.client import Client

client = Client(api_key="your-api-key")
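
When PHOENIX_API_KEY is set in the environment instead, the client should pick the key up automatically, so no argument is needed (a minimal sketch):

client = Client()  # reads PHOENIX_API_KEY from the environment when no api_key is passed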

Custom Headers

By default, the Phoenix client sends the API key using the bearer authentication scheme in its HTTP headers. If you need different headers, e.g. for Phoenix Cloud, they can be customized via an environment variable...

import os

os.environ["PHOENIX_CLIENT_HEADERS"] = "api-key=your-api-key,"  # use `api-key` for Phoenix Cloud

...or passed directly to the client.

from phoenix.client import Client

client = Client(headers={"api-key": "your-api-key"})  # use `api-key` for Phoenix Cloud
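
For Phoenix Cloud, the custom header is typically combined with the URL of your hosted instance; a minimal sketch using placeholder values:

from phoenix.client import Client

client = Client(
    base_url="your-phoenix-cloud-url",  # placeholder for your Phoenix Cloud instance URL
    headers={"api-key": "your-api-key"},
)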

Prompt Management

With the Phoenix client, you can push and pull prompts to and from your Phoenix server.

from phoenix.client import Client
from phoenix.client.types import PromptVersion

# Change base_url to your Phoenix server URL
base_url = "http://localhost:6006"
client = Client(base_url=base_url)

# prompt identifier consists of alphanumeric characters, hyphens or underscores
prompt_identifier = "haiku-writer"

content = "Write a haiku about {{topic}}"
prompt = client.prompts.create(
    name=prompt_identifier,
    version=PromptVersion(
        [{"role": "user", "content": content}],
        model_name="gpt-4o-mini",
    ),
)

The client can retrieve a prompt by its name.

prompt = client.prompts.get(prompt_identifier=prompt_identifier)

The prompt can be used to generate completions.

from openai import OpenAI

variables = {"topic": "programming"}
resp = OpenAI().chat.completions.create(**prompt.format(variables=variables))
print(resp.choices[0].message.content)
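
As the example above suggests, prompt.format(...) supplies the keyword arguments for the chat completion call, so the same stored prompt can be reused with different variables:

openai_client = OpenAI()
for topic in ["programming", "recursion"]:
    # Each iteration formats the stored template with a different variable value.
    resp = openai_client.chat.completions.create(**prompt.format(variables={"topic": topic}))
    print(resp.choices[0].message.content)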

To learn more about prompt engineering with Phoenix, see the Phoenix documentation.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

arize_phoenix_client-1.0.3.tar.gz (29.5 kB)

Uploaded Source

Built Distribution

arize_phoenix_client-1.0.3-py3-none-any.whl (32.3 kB)

Uploaded Python 3

File details

Details for the file arize_phoenix_client-1.0.3.tar.gz.

File metadata

  • Download URL: arize_phoenix_client-1.0.3.tar.gz
  • Upload date:
  • Size: 29.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for arize_phoenix_client-1.0.3.tar.gz

  • SHA256: 4fd6766641803acea58abf3e2eb36cac6753f10b00b11736a0ed7c12cf07acfb
  • MD5: 62e740e957756a7280bf6fefe6e52b82
  • BLAKE2b-256: f91ded50c40aaae97a9a2c58f95450effe617056dd456b27d39663a8233c2083


File details

Details for the file arize_phoenix_client-1.0.3-py3-none-any.whl.

File metadata

File hashes

Hashes for arize_phoenix_client-1.0.3-py3-none-any.whl

  • SHA256: d095a1515107a8f3dbcbe550fc9cd292e53abc8716e4cfdfa08c89798eb7fdbf
  • MD5: dda4f1543bd58ae7aa1e325a8f3467a7
  • BLAKE2b-256: 8f42e49dc8015645c1083c381cb4a7d0e7012e6668be14975414d0b782fc9398

