ATI LlamaIndex integration: retrieval/synthesis callbacks -> ATI spans
ATI Integration for LlamaIndex
This package provides OpenTelemetry instrumentation for LlamaIndex applications using Iocane ATI.
It captures:
- LLM calls: requests to backend models.
- Retrieval: vector store lookups and query engine operations.
- Synthesis: response generation steps.
Installation
```shell
pip install ati-integrations-llamaindex opentelemetry-sdk opentelemetry-exporter-otlp
```
Configuration
Set the standard OpenTelemetry environment variables to point to your Iocane collector:
```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.iocane.ai/v1/traces"
export OTEL_EXPORTER_OTLP_HEADERS="x-iocane-key=YOUR_KEY,x-ati-env=YOUR_ENV_ID"
export OTEL_SERVICE_NAME="my-llamaindex-app"
```
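The `OTEL_EXPORTER_OTLP_HEADERS` value is a comma-separated list of `key=value` pairs. A quick way to sanity-check it before starting the app (a stdlib-only sketch; `parse_otlp_headers` is an illustrative helper, not part of this package):

```python
import os


def parse_otlp_headers(raw: str) -> dict:
    """Parse a comma-separated key=value list, as used by OTEL_EXPORTER_OTLP_HEADERS."""
    headers = {}
    for pair in raw.split(","):
        if "=" in pair:
            key, _, value = pair.partition("=")
            headers[key.strip()] = value.strip()
    return headers


# Fall back to the placeholder value from the Configuration section above
raw = os.environ.get(
    "OTEL_EXPORTER_OTLP_HEADERS",
    "x-iocane-key=YOUR_KEY,x-ati-env=YOUR_ENV_ID",
)
headers = parse_otlp_headers(raw)
assert "x-iocane-key" in headers, "missing Iocane API key header"
```

A check like this fails fast at startup instead of producing silent 401s at export time.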
Usage
Here is a robust pattern for instrumenting LlamaIndex applications.

Important: LlamaIndex creates many internal components. Initialize OpenTelemetry before importing or constructing LlamaIndex modules where possible, and make sure the global tracer provider is set before any spans are created.
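As a defensive check on the import-ordering advice above, you can detect whether LlamaIndex was already imported before instrumentation was configured (a stdlib-only sketch; the helper name and warning text are illustrative, not part of this package):

```python
import sys
import warnings


def warn_if_imported_early(module_prefix: str = "llama_index") -> bool:
    """Return True (and warn) if the module was imported before instrumentation."""
    already_imported = any(
        name == module_prefix or name.startswith(module_prefix + ".")
        for name in sys.modules
    )
    if already_imported:
        warnings.warn(
            f"{module_prefix} was imported before OpenTelemetry was configured; "
            "spans from early component initialization may be missed."
        )
    return already_imported
```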
```python
import os

from ati_llamaindex import LlamaIndexInstrumentor
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# OpenTelemetry imports
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.resources import Resource, SERVICE_NAME
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter


def main():
    # 1. Configure OpenTelemetry (robust pattern)
    resource = Resource.create(attributes={SERVICE_NAME: "my-llamaindex-service"})
    try:
        # Try to set the global provider
        provider = TracerProvider(resource=resource)
        trace.set_tracer_provider(provider)
    except Exception:
        # If it fails (e.g., already set), fall through to the active one below
        pass

    # ALWAYS fetch the global provider to ensure we attach to the active pipeline
    provider = trace.get_tracer_provider()

    # 2. Configure the exporter (Iocane)
    # Use the endpoint from the environment if it targets the traces path
    endpoint = os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT")
    if endpoint and endpoint.endswith("/v1/traces"):
        exporter = OTLPSpanExporter(endpoint=endpoint)
    else:
        exporter = OTLPSpanExporter()

    if hasattr(provider, "add_span_processor"):
        provider.add_span_processor(BatchSpanProcessor(exporter))
    else:
        print("WARNING: TracerProvider does not support add_span_processor.")

    # 3. Instrument LlamaIndex
    # Do this before creating indices or query engines; keep the instance
    # so the same instrumentor is uninstrumented later
    instrumentor = LlamaIndexInstrumentor()
    instrumentor.instrument()

    try:
        # 4. Your LlamaIndex code
        # documents = SimpleDirectoryReader("data").load_data()
        # index = VectorStoreIndex.from_documents(documents)
        # query_engine = index.as_query_engine()
        # response = query_engine.query("What does this data represent?")
        # print(response)
        pass
    finally:
        # 5. Uninstrument and flush traces
        instrumentor.uninstrument()
        if hasattr(provider, "shutdown"):
            provider.shutdown()


if __name__ == "__main__":
    main()
```
Environment Variables for Instrumentation
| Variable | Description | Default |
|---|---|---|
| `ATI_CAPTURE_PAYLOADS` | Set to `true` to capture query text and response content as span events | `false` |
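Since `ATI_CAPTURE_PAYLOADS` arrives as a string, applications that mirror this flag in their own config should interpret it consistently (a stdlib sketch; the accepted set of truthy spellings is an assumption, not documented behavior of this package):

```python
import os


def capture_payloads_enabled(env=None) -> bool:
    """Interpret ATI_CAPTURE_PAYLOADS; any non-truthy spelling is treated as False."""
    env = os.environ if env is None else env
    value = env.get("ATI_CAPTURE_PAYLOADS", "false")
    return value.strip().lower() in {"1", "true", "yes", "on"}
```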
File details
Details for the file ati_integrations_llamaindex-0.1.1.tar.gz.
File metadata
- Download URL: ati_integrations_llamaindex-0.1.1.tar.gz
- Upload date:
- Size: 6.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `2875919260fd67628beecfc8a84097e4731a595e2c578f1413109c211618d2dd` |
| MD5 | `c95f1079f982d849187f3ad68d0869b6` |
| BLAKE2b-256 | `2856898348bd4e90efc6c843cd0276168b40e99d9eeb7b0336e367ec255c1ecd` |
File details
Details for the file ati_integrations_llamaindex-0.1.1-py3-none-any.whl.
File metadata
- Download URL: ati_integrations_llamaindex-0.1.1-py3-none-any.whl
- Upload date:
- Size: 4.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `138828dc37d90e8fcb19692fec97a131b7707c92f2e4da0eae9ed70ecf9e1f8e` |
| MD5 | `5c1348907205bef7fca1a8b7c364eca7` |
| BLAKE2b-256 | `d9ef620cd63cc0ab706aa30cb18967002c779820a89df3af5d1cbefd925ffb81` |