OpenTelemetry instrumentation for Llama Index

Llama Index OpenTelemetry Integration

Overview

This integration provides support for using OpenTelemetry with the Llama Index framework. It enables tracing and monitoring of applications built with Llama Index.

Installation

  1. Install traceAI Llama Index
pip install traceAI-llamaindex
  2. Install Llama Index
pip install llama-index

Set Environment Variables

Set up your environment variables to authenticate with FutureAGI.

import os

# Replace the placeholder strings with your FutureAGI credentials.
os.environ["FI_API_KEY"] = "your-futureagi-api-key"
os.environ["FI_SECRET_KEY"] = "your-futureagi-secret-key"

Quickstart

Register Tracer Provider

Set up the trace provider to establish the observability pipeline. The trace provider is responsible for exporting the spans your application generates to FutureAGI.

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="llama_index_app"
)
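
The returned trace_provider can also be used for custom spans, assuming register returns a standard OpenTelemetry TracerProvider (a sketch; the span name is illustrative):

# Acquire a tracer from the provider and wrap application logic in a span.
tracer = trace_provider.get_tracer(__name__)

with tracer.start_as_current_span("llama_index_app.startup"):
    # Application setup that you want to appear in the trace.
    pass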

Configure Llama Index Instrumentation

Instrument Llama Index to enable telemetry collection. This step ensures that all interactions with the Llama Index SDK are traced and monitored.

from traceai_llamaindex import LlamaIndexInstrumentor

LlamaIndexInstrumentor().instrument(tracer_provider=trace_provider)

Create Llama Index Components

Set up your Llama Index components as usual; with instrumentation enabled, their activity is traced automatically.

from llama_index.agent.openai import OpenAIAgent
from llama_index.core import Settings
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

# Configure the default LLM before building the agent so it is picked up.
Settings.llm = OpenAI(model="gpt-3.5-turbo")

# Wrap the plain Python functions as tools the agent can call.
multiply_tool = FunctionTool.from_defaults(fn=multiply)
add_tool = FunctionTool.from_defaults(fn=add)
agent = OpenAIAgent.from_tools([multiply_tool, add_tool])

if __name__ == "__main__":
    # Each query produces spans for the LLM and tool calls via the instrumentor.
    response = agent.query("What is (121 * 3) + 42?")
    print(response)
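
Once the instrumentor is registered, other Llama Index workflows are traced the same way. A minimal sketch with an in-memory vector index (the document text and query are illustrative; an OpenAI API key is assumed for embeddings and the LLM):

from llama_index.core import Document, VectorStoreIndex

# Build a small in-memory index; the embedding and LLM calls made here
# are captured by the instrumentor registered above.
documents = [Document(text="traceAI adds OpenTelemetry tracing to Llama Index applications.")]
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
print(query_engine.query("What does traceAI add to Llama Index?"))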
