
LLM Observability

Project description

arize-phoenix-client

Phoenix Client is a lightweight package for interacting with the Phoenix server via its OpenAPI REST interface.

Installation

Install via pip.

pip install -Uq arize-phoenix-client

Usage

from phoenix.client import Client

client = Client(base_url="your-server-url")  # base_url defaults to http://localhost:6006

Authentication (if applicable)

The Phoenix API key can be set as an environment variable...

import os

os.environ["PHOENIX_API_KEY"] = "your-api-key"

...or passed directly to the client.

from phoenix.client import Client

client = Client(api_key="your-api-key")

Custom Headers

By default, the Phoenix client uses bearer authentication in its HTTP headers. If you need different headers, e.g. for Phoenix Cloud, they can be customized via an environment variable...

import os

os.environ["PHOENIX_CLIENT_HEADERS"] = "api-key=your-api-key,"  # use `api-key` for Phoenix Cloud

...or passed directly to the client.

from phoenix.client import Client

client = Client(headers={"api-key": "your-api-key"})  # use `api-key` for Phoenix Cloud

Prompt Management

With the Phoenix client, you can push and pull prompts to and from your Phoenix server.

from phoenix.client import Client
from phoenix.client.types import PromptVersion

# Change base_url to your Phoenix server URL
base_url = "http://localhost:6006"
client = Client(base_url=base_url)

# A prompt identifier consists of alphanumeric characters, hyphens, or underscores
prompt_identifier = "haiku-writer"

content = "Write a haiku about {{topic}}"
prompt = client.prompts.create(
    name=prompt_identifier,
    version=PromptVersion(
        [{"role": "user", "content": content}],
        model_name="gpt-4o-mini",
    ),
)

The client can retrieve a prompt by its name.

prompt = client.prompts.get(prompt_identifier=prompt_identifier)
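
If you track specific versions, a particular version can also be pulled. The snippet below is a minimal sketch that assumes prompts.get also accepts a prompt_version_id parameter (and a tag parameter for tagged versions); check your installed client version.

# Minimal sketch (assumed parameters): pull an exact version by its ID
prompt_v1 = client.prompts.get(prompt_version_id="your-prompt-version-id")

# Or pull whichever version carries a given tag (assumes tag support)
prompt_prod = client.prompts.get(prompt_identifier=prompt_identifier, tag="production")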

The prompt can be used to generate completions.

from openai import OpenAI

variables = {"topic": "programming"}
resp = OpenAI().chat.completions.create(**prompt.format(variables=variables))
print(resp.choices[0].message.content)
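
The same prompt object can be formatted with different variables for additional completions, for example:

from openai import OpenAI

# Reuse the stored prompt with a different template variable value
resp = OpenAI().chat.completions.create(**prompt.format(variables={"topic": "tea"}))
print(resp.choices[0].message.content)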

To learn more about prompt engineering using Phoenix, see the Phoenix documentation.

Download files

Download the file for your platform.

Source Distribution

arize_phoenix_client-1.1.0.tar.gz (29.7 kB)

Built Distribution

arize_phoenix_client-1.1.0-py3-none-any.whl (32.5 kB)

File details

Details for the file arize_phoenix_client-1.1.0.tar.gz.

File metadata

  • Download URL: arize_phoenix_client-1.1.0.tar.gz
  • Upload date:
  • Size: 29.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for arize_phoenix_client-1.1.0.tar.gz

  • SHA256: c1d158a6080178f9c1c4b5019a0781a2cc2d7c86b80170ebafe990a9c43555d4
  • MD5: b83af6f5f1696d3cad0cd7690668c48b
  • BLAKE2b-256: 229e1eb3f989b302dde3267849cda0e22cba3e0df1a335b11cc3a2e71952b0d6

See more details on using hashes here.
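
For illustration (using the digest listed above), a downloaded archive can be checked with Python's standard library:

import hashlib

# SHA256 digest for the 1.1.0 sdist, as listed above
expected = "c1d158a6080178f9c1c4b5019a0781a2cc2d7c86b80170ebafe990a9c43555d4"

with open("arize_phoenix_client-1.1.0.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("hash OK" if actual == expected else "hash mismatch")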

File details

Details for the file arize_phoenix_client-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: arize_phoenix_client-1.1.0-py3-none-any.whl
  • Size: 32.5 kB
  • Tags: Python 3

File hashes

Hashes for arize_phoenix_client-1.1.0-py3-none-any.whl

  • SHA256: 60f2081ed48e364eff80117aaa48f5ceb56d1f0a77d266e3b771f55d951fc28d
  • MD5: 8714d9f0a4c0b8088d4605a2dbde45fe
  • BLAKE2b-256: 91392f2febce07e99dfbc31b4b0beb4eef7513d9419f7742204735ba18d541dd

See more details on using hashes here.
