OpenTelemetry instrumentation for Llama Index


Llama Index OpenTelemetry Integration

Overview

This integration adds OpenTelemetry support to the Llama Index framework, enabling tracing and monitoring of applications built with Llama Index.

Installation

  1. Install traceAI Llama Index
pip install traceAI-llamaindex
  2. Install Llama Index
pip install llama-index

Set Environment Variables

Set up your environment variables to authenticate with FutureAGI, replacing the placeholder strings with your own credentials:

import os

os.environ["FI_API_KEY"] = "your-fi-api-key"
os.environ["FI_SECRET_KEY"] = "your-fi-secret-key"
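Because the quickstart assumes these variables are already set, a small guard can fail fast with a clear error when they are missing. This is a minimal sketch; `require_env` is a hypothetical helper, not part of the SDK:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise if it is unset/empty."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Example usage (assumes the variables were exported in your shell or set above):
# fi_api_key = require_env("FI_API_KEY")
# fi_secret_key = require_env("FI_SECRET_KEY")
```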

Quickstart

Register Tracer Provider

Set up the trace provider to establish the observability pipeline. The trace provider routes spans from your application to FutureAGI:

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="llama_index_app"
)

Configure Llama Index Instrumentation

Instrument the Llama Index client to enable telemetry collection. This step ensures that all interactions with the Llama Index SDK are tracked and monitored.

from traceai_llamaindex import LlamaIndexInstrumentor

LlamaIndexInstrumentor().instrument(tracer_provider=trace_provider)

Create Llama Index Components

Set up your Llama Index client with built-in observability.

from llama_index.agent.openai import OpenAIAgent
from llama_index.core import Settings
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

multiply_tool = FunctionTool.from_defaults(fn=multiply)
add_tool = FunctionTool.from_defaults(fn=add)

# Configure the default LLM before constructing the agent so the setting takes effect.
Settings.llm = OpenAI(model="gpt-3.5-turbo")
agent = OpenAIAgent.from_tools([multiply_tool, add_tool])

if __name__ == "__main__":
    response = agent.query("What is (121 * 3) + 42?")
    print(response)
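As a sanity check on the tool definitions above, the expected decomposition of the example query can be verified directly in plain Python, with no agent, API key, or network call required:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

# The agent is expected to decompose "(121 * 3) + 42" into these two tool calls:
result = add(multiply(121, 3), 42)
print(result)  # 405
```

The spans emitted for the agent run should show the LLM choosing `multiply` first and feeding its output into `add`.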
