traceAI Python SDK
The Python SDK for traceAI provides OpenTelemetry-native instrumentation for AI applications.
Installation
Install the core instrumentation library and your framework of choice:
```bash
# Core library (required)
pip install fi-instrumentation

# Framework-specific instrumentation
pip install traceai-openai      # For OpenAI
pip install traceai-anthropic   # For Anthropic
pip install traceai-langchain   # For LangChain
pip install traceai-llamaindex  # For LlamaIndex
# ... see full list below
```
Quick Start
```python
import os

import openai
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor

# Set environment variables
os.environ["FI_API_KEY"] = "<your-api-key>"
os.environ["FI_SECRET_KEY"] = "<your-secret-key>"

# Register tracer provider
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_app",
)

# Instrument your framework
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Use as normal - tracing happens automatically
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
Core Concepts
Project Types
- ProjectType.OBSERVE: For production monitoring. Cannot use eval tags.
- ProjectType.EXPERIMENT: For development/testing. Supports eval tags for AI evaluations.
TraceConfig (Privacy Controls)
Control what data gets captured:
```python
from fi_instrumentation.instrumentation.config import TraceConfig

config = TraceConfig(
    hide_inputs=False,              # True: hide all input values
    hide_outputs=False,             # True: hide all output values
    hide_input_messages=False,      # True: hide input messages only
    hide_output_messages=False,     # True: hide output messages only
    hide_input_images=False,        # True: hide images in inputs
    hide_embedding_vectors=False,   # True: hide embedding vectors
    base64_image_max_length=32000,  # Truncate base64 images beyond this length
)
```
Or use environment variables:
- FI_HIDE_INPUTS=true
- FI_HIDE_OUTPUTS=true
- FI_HIDE_INPUT_MESSAGES=true
- FI_HIDE_OUTPUT_MESSAGES=true
- FI_HIDE_INPUT_IMAGES=true
- FI_HIDE_EMBEDDING_VECTORS=true
Context Managers
Add metadata to spans:
```python
from fi_instrumentation import using_attributes

with using_attributes(
    session_id="session-123",
    user_id="user-456",
    metadata={"environment": "production"},
    tags=["chat", "support"],
):
    response = client.chat.completions.create(...)
```
Available context managers:
- using_session(session_id) - Track session
- using_user(user_id) - Track user
- using_metadata(dict) - Add custom metadata
- using_tags(list) - Add categorical tags
- using_prompt_template(template, version, variables) - Track prompt variants
- using_attributes(...) - Combined context manager
- suppress_tracing() - Temporarily disable tracing
Evaluation Tags (Experiments Only)
Run automated evaluations on spans:
```python
from fi_instrumentation import register
from fi_instrumentation.fi_types import (
    ProjectType, EvalTag, EvalTagType,
    EvalSpanKind, EvalName, ModelChoices,
)

eval_tags = [
    EvalTag(
        type=EvalTagType.OBSERVATION_SPAN,
        value=EvalSpanKind.LLM,
        eval_name=EvalName.CONTEXT_ADHERENCE,
        custom_eval_name="my_context_check",
        mapping={
            "context": "raw.input",
            "output": "raw.output",
        },
        model=ModelChoices.TURING_SMALL,
    )
]

trace_provider = register(
    project_type=ProjectType.EXPERIMENT,
    project_name="my_experiment",
    eval_tags=eval_tags,
)
```
Available Evaluations (60+):
| Category | Evaluations |
|---|---|
| Content Quality | CONTEXT_ADHERENCE, COMPLETENESS, GROUNDEDNESS, SUMMARY_QUALITY |
| Safety | TOXICITY, PII, CONTENT_MODERATION, PROMPT_INJECTION |
| Accuracy | FACTUAL_ACCURACY, CONTEXT_RELEVANCE, DETECT_HALLUCINATION |
| Bias | BIAS_DETECTION, NO_RACIAL_BIAS, NO_GENDER_BIAS |
| Format | IS_JSON, IS_CODE, ONE_LINE, CONTAINS_VALID_LINK |
| Similarity | BLEU_SCORE, ROUGE_SCORE, EMBEDDING_SIMILARITY |
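Most of these evaluations run on the hosted eval models, but similarity-style checks such as EMBEDDING_SIMILARITY reduce to a vector comparison. A plain-Python sketch of cosine similarity, the usual metric for that kind of eval (illustrative, not the SDK's implementation):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 = identical
    direction, 0.0 = orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```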
Supported Frameworks
LLM Providers
| Package | Framework |
|---|---|
| traceai-openai | OpenAI |
| traceai-anthropic | Anthropic |
| traceai-mistralai | Mistral AI |
| traceai-groq | Groq |
| traceai-vertexai | Google Vertex AI |
| traceai-google-genai | Google Generative AI |
| traceai-google-adk | Google ADK |
| traceai-bedrock | AWS Bedrock |
| traceai-litellm | LiteLLM |
| traceai-portkey | Portkey |
Agent Frameworks
| Package | Framework |
|---|---|
| traceai-langchain | LangChain |
| traceai-llamaindex | LlamaIndex |
| traceai-crewai | CrewAI |
| traceai-autogen | AutoGen |
| traceai-openai-agents | OpenAI Agents |
| traceai-smolagents | Smol Agents |
| traceai-dspy | DSPy |
| traceai-haystack | Haystack |
Tools & Integrations
| Package | Framework |
|---|---|
| traceai-instructor | Instructor |
| traceai-guardrails | Guardrails AI |
| traceai-mcp | Model Context Protocol |
| traceai-pipecat | Pipecat |
| traceai-livekit | LiveKit |
Vector Databases
| Package | Database |
|---|---|
| traceai-pinecone | Pinecone |
| traceai-chromadb | ChromaDB |
| traceai-qdrant | Qdrant |
| traceai-weaviate | Weaviate |
| traceai-milvus | Milvus |
| traceai-lancedb | LanceDB |
| traceai-mongodb | MongoDB Atlas Vector |
| traceai-pgvector | pgvector |
| traceai-redis | Redis Vector |
Environment Variables
Authentication
- FI_API_KEY - API key for Future AGI
- FI_SECRET_KEY - Secret key for Future AGI
Endpoints
- FI_BASE_URL - HTTP collector endpoint (default: https://api.futureagi.com)
- FI_GRPC_URL - gRPC collector endpoint (default: https://grpc.futureagi.com)
Project
- FI_PROJECT_NAME - Default project name
- FI_PROJECT_VERSION_NAME - Default version name
Performance
- OTEL_BSP_SCHEDULE_DELAY - Batch export delay (ms)
- OTEL_BSP_MAX_QUEUE_SIZE - Max queue size
- OTEL_BSP_MAX_EXPORT_BATCH_SIZE - Max batch size
- OTEL_BSP_EXPORT_TIMEOUT - Export timeout (ms)
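These are the standard OpenTelemetry BatchSpanProcessor settings, so they must be in the environment before register() creates the tracer provider. For example:

```python
import os

# Tune batch exporting before creating the tracer provider.
os.environ["OTEL_BSP_SCHEDULE_DELAY"] = "2000"        # export every 2 s
os.environ["OTEL_BSP_MAX_QUEUE_SIZE"] = "4096"        # buffer up to 4096 spans
os.environ["OTEL_BSP_MAX_EXPORT_BATCH_SIZE"] = "512"  # spans per export call
os.environ["OTEL_BSP_EXPORT_TIMEOUT"] = "30000"       # 30 s export timeout
```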
Advanced Usage
Custom TracerProvider
```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceai_openai import OpenAIInstrumentor

# Create custom provider
provider = TracerProvider()
trace.set_tracer_provider(provider)

# Add custom exporter
exporter = OTLPSpanExporter(
    endpoint="https://your-collector.com/v1/traces",
    headers={"Authorization": "Bearer your-token"},
)
provider.add_span_processor(BatchSpanProcessor(exporter))

# Instrument
OpenAIInstrumentor().instrument(tracer_provider=provider)
```
Semantic Conventions
traceAI supports multiple semantic conventions:
```python
from fi_instrumentation import register
from fi_instrumentation.fi_types import SemanticConvention

trace_provider = register(
    project_name="my_app",
    semantic_convention=SemanticConvention.FI,  # Default
    # Or: SemanticConvention.OTEL_GENAI
    # Or: SemanticConvention.OPENINFERENCE
    # Or: SemanticConvention.OPENLLMETRY
)
```
Transport Options
```python
from fi_instrumentation import register
from fi_instrumentation.fi_types import Transport

# HTTP (default)
trace_provider = register(
    project_name="my_app",
    transport=Transport.HTTP,
)

# gRPC (requires grpc extras)
trace_provider = register(
    project_name="my_app",
    transport=Transport.GRPC,
)
```
Examples
See framework-specific examples in each instrumentation package.
Documentation
License
GPL-3.0 License