
llama-index-packs-arize-phoenix-query-engine

Project description

Arize-Phoenix LlamaPack

This LlamaPack instruments your LlamaIndex app for LLM tracing with Phoenix, an open-source LLM observability library from Arize AI.

CLI Usage

You can download LlamaPacks directly using llamaindex-cli, which is installed with the llama-index Python package:

llamaindex-cli download-llamapack ArizePhoenixQueryEnginePack --download-dir ./arize_pack

You can then inspect the files at ./arize_pack and use them as a template for your own project!

Code Usage

You can download the pack to the ./arize_pack directory:

from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
ArizePhoenixQueryEnginePack = download_llama_pack(
    "ArizePhoenixQueryEnginePack", "./arize_pack"
)

You can then inspect the files at ./arize_pack or continue on to use the module.

import os

from llama_index.core.node_parser import SentenceSplitter
from llama_index.readers.web import SimpleWebPageReader
from tqdm.auto import tqdm

Configure your OpenAI API key.

os.environ["OPENAI_API_KEY"] = "copy-your-openai-api-key-here"
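Rather than hardcoding the key in source, you can read it from the environment and prompt only when it is missing. This is a minimal stdlib sketch; `ensure_openai_key` is a hypothetical helper, not part of the pack:

```python
import os
from getpass import getpass  # stdlib; prompts without echoing the key


def ensure_openai_key() -> str:
    """Return the OpenAI API key, prompting only if it isn't already set."""
    key = os.environ.get("OPENAI_API_KEY")
    if key is None:
        # Falls back to an interactive prompt (hypothetical convenience).
        key = getpass("OpenAI API key: ")
        os.environ["OPENAI_API_KEY"] = key
    return key
```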

Parse your documents into a list of nodes and pass them to your LlamaPack. In this example, we use nodes from a Paul Graham essay as input.

documents = SimpleWebPageReader().load_data(
    [
        "https://raw.githubusercontent.com/jerryjliu/llama_index/adb054429f642cc7bbfcb66d4c232e072325eeab/examples/paul_graham_essay/data/paul_graham_essay.txt"
    ]
)
parser = SentenceSplitter()
nodes = parser.get_nodes_from_documents(documents)
phoenix_pack = ArizePhoenixQueryEnginePack(nodes=nodes)
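As a rough illustration of what the splitter step does, here is a toy stand-in for sentence-boundary chunking, assuming greedy packing of whole sentences into size-bounded chunks. This is not LlamaIndex's actual `SentenceSplitter` implementation, just a self-contained sketch:

```python
import re


def naive_sentence_chunks(text: str, chunk_size: int = 200) -> list[str]:
    """Greedily pack whole sentences into chunks of at most `chunk_size`
    characters (a toy stand-in for a sentence splitter, not the real one)."""
    # Split after sentence-ending punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    chunks, current = [], ""
    for sentence in sentences:
        if current and len(current) + 1 + len(sentence) > chunk_size:
            chunks.append(current)
            current = sentence
        else:
            current = f"{current} {sentence}".strip()
    if current:
        chunks.append(current)
    return chunks
```

Each resulting chunk would correspond, loosely, to one node handed to the pack.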

Run a set of queries via the pack's run method, which delegates to the underlying query engine.

queries = [
    "What did Paul Graham do growing up?",
    "When and how did Paul Graham's mother die?",
    "What, in Paul Graham's opinion, is the most distinctive thing about YC?",
    "When and how did Paul Graham meet Jessica Livingston?",
    "What is Bel, and when and where was it written?",
]
for query in tqdm(queries):
    print("Query")
    print("=====")
    print(query)
    print()
    response = phoenix_pack.run(query)
    print("Response")
    print("========")
    print(response)
    print()

View your trace data in the Phoenix UI.

phoenix_session_url = phoenix_pack.get_modules()["session_url"]
print(f"Open the Phoenix UI to view your trace data: {phoenix_session_url}")

You can access the internals of the LlamaPack, including your Phoenix session and your query engine, via the get_modules method.

phoenix_pack.get_modules()

Check out the Phoenix documentation for more information!


File details

Details for the file llama_index_packs_arize_phoenix_query_engine-0.4.1.tar.gz (source distribution).

File hashes

Algorithm    Hash digest
SHA256       aab98df12b571afa42ce8d5fa119fea94d21674b002be6e731abe8514c789cfb
MD5          df5468038fb16da13a2af8da1e15ea2b
BLAKE2b-256  63d3eeb61d3b6a6a6ab9bf356471a89d9f09a4d4ec9e478c4dc4df7c60e98178
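To verify that a downloaded file matches a published digest, you can compute its SHA256 with the stdlib `hashlib`; `sha256_of` below is an illustrative helper, not a PyPI tool:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, streaming in 8 KiB chunks
    so large archives don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(8192), b""):
            h.update(block)
    return h.hexdigest()


# Compare the result against the digest published for the file above.
```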


File details

Details for the file llama_index_packs_arize_phoenix_query_engine-0.4.1-py3-none-any.whl (built distribution).

File hashes

Algorithm    Hash digest
SHA256       3d73144f0c93a6f627d1fc968461a44e1a453ea56e42cee96a57229d84e4fc30
MD5          5c7a394ef79ff968c25020845e6d3538
BLAKE2b-256  0911ffa31c873e61d787c6cba3b3daa572baa2ae944913f0a1be1208cd99614d

