
Llama Index OpenTelemetry Integration

Overview

This integration provides support for using OpenTelemetry with the Llama Index framework. It enables tracing and monitoring of applications built with Llama Index.

Installation

  1. Install traceAI Llama Index
pip install traceAI-llamaindex
  2. Install Llama Index
pip install llama-index

Set Environment Variables

Set up your environment variables to authenticate with FutureAGI.

import os

# Replace the placeholders with your FutureAGI credentials.
os.environ["FI_API_KEY"] = "<your-fi-api-key>"
os.environ["FI_SECRET_KEY"] = "<your-fi-secret-key>"
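If the credentials are missing, traces will not reach FutureAGI. A small pre-flight check (plain Python, not part of the SDK) can catch this before registration:

```python
import os

# Names required by the FutureAGI exporter (taken from the section above).
required = ("FI_API_KEY", "FI_SECRET_KEY")

missing = [name for name in required if not os.environ.get(name)]
if missing:
    print(f"Missing environment variables: {', '.join(missing)}")
else:
    print("Credentials found; safe to call register().")
```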

Quickstart

Register Tracer Provider

Set up the trace provider to establish the observability pipeline. The trace provider manages the collection and export of spans from your application.

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="llama_index_app"
)

Configure Llama Index Instrumentation

Instrument Llama Index to enable telemetry collection. This step ensures that all interactions with the Llama Index framework are tracked and monitored.

from traceai_llamaindex import LlamaIndexInstrumentor

LlamaIndexInstrumentor().instrument(tracer_provider=trace_provider)

Create Llama Index Components

Build your Llama Index agent as usual; with instrumentation in place, its operations are traced automatically.

from llama_index.agent.openai import OpenAIAgent
from llama_index.core import Settings
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

# Configure the LLM before creating the agent so the agent picks it up.
Settings.llm = OpenAI(model="gpt-3.5-turbo")

multiply_tool = FunctionTool.from_defaults(fn=multiply)
add_tool = FunctionTool.from_defaults(fn=add)
agent = OpenAIAgent.from_tools([multiply_tool, add_tool])

if __name__ == "__main__":
    response = agent.query("What is (121 * 3) + 42?")
    print(response)
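The tool functions are plain Python, so the answer the agent should produce can be sanity-checked directly, without an API key or an LLM call:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

# (121 * 3) + 42 = 363 + 42 = 405
result = add(multiply(121, 3), 42)
print(result)  # 405
```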
