# HoneyHive Python SDK
A comprehensive Python SDK for HoneyHive, providing LLM observability, evaluation, and tracing capabilities with OpenTelemetry integration.
## Features
- OpenTelemetry Integration - Full OTEL compliance with custom span processor and exporter
- Automatic Session Management - Seamless session creation and management
- Decorator Support - Easy-to-use `@trace` (unified sync/async), `@atrace`, and `@trace_class` decorators
- Context Managers - `start_span` and `enrich_span` for manual span management
- HTTP Instrumentation - Automatic HTTP request tracing
- Baggage Support - Context propagation across service boundaries
- Experiment Harness Integration - Automatic experiment tracking with MLflow, Weights & Biases, and Comet support
- Real-time API Integration - Direct integration with HoneyHive backend services
- Comprehensive Testing - Full test suite with 203 passing tests
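A unified sync/async decorator like `@trace` typically dispatches on whether the wrapped function is a coroutine. The following is a minimal illustrative sketch of that dispatch pattern (`trace_sketch` is a made-up name, not HoneyHive's actual implementation):

```python
import asyncio
import functools
import inspect


def trace_sketch(func):
    """Wrap sync and async functions with a single decorator (illustrative)."""
    if inspect.iscoroutinefunction(func):
        @functools.wraps(func)
        async def async_wrapper(*args, **kwargs):
            # A real tracer would open and close a span around this call
            return await func(*args, **kwargs)
        return async_wrapper

    @functools.wraps(func)
    def sync_wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return sync_wrapper


@trace_sketch
def greet():
    return "Hello, World!"


@trace_sketch
async def agreet():
    return "Hello, Async World!"


print(greet())                # Hello, World!
print(asyncio.run(agreet()))  # Hello, Async World!
```

The key point is that a single decorator can preserve the sync or async nature of whatever it wraps, so callers never need to choose between two decorators.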
## Installation
### Choose Your Instrumentor Type

HoneyHive supports both OpenInference (lightweight) and OpenLLMetry (enhanced metrics) instrumentors.
### Option A: OpenInference (Recommended for Beginners)

```bash
# Install with OpenAI integration (most common)
pip install honeyhive[openinference-openai]

# Install with Anthropic integration
pip install honeyhive[openinference-anthropic]

# Install with Google AI integration
pip install honeyhive[openinference-google-ai]

# Install with multiple providers
pip install honeyhive[openinference-openai,openinference-anthropic,openinference-google-ai]

# Install all OpenInference integrations
pip install honeyhive[all-openinference]
```
### Option B: OpenLLMetry (Enhanced Metrics)

```bash
# Install with OpenAI integration (enhanced metrics)
pip install honeyhive[traceloop-openai]

# Install with Anthropic integration
pip install honeyhive[traceloop-anthropic]

# Install with Google AI integration
pip install honeyhive[traceloop-google-ai]

# Install with multiple providers
pip install honeyhive[traceloop-openai,traceloop-anthropic,traceloop-google-ai]

# Install all OpenLLMetry integrations
pip install honeyhive[all-traceloop]
```
### Option C: Mix Both Types

```bash
# Strategic mixing based on your needs
pip install honeyhive[traceloop-openai,openinference-anthropic]
```

### Basic Installation (manual instrumentor setup required)

```bash
pip install honeyhive
```
## Including in Your Project
For detailed guidance on including HoneyHive in your pyproject.toml, see our pyproject.toml Integration Guide.
### Development Installation

```bash
git clone https://github.com/honeyhiveai/python-sdk.git
cd python-sdk

# Create and activate virtual environment named 'python-sdk' (required)
python -m venv python-sdk
source python-sdk/bin/activate  # On Windows: python-sdk\Scripts\activate

# Install in development mode
pip install -e .

# MANDATORY: Set up development environment (one-time setup)
./scripts/setup-dev.sh

# Verify setup (should pass all checks)
tox -e format && tox -e lint
```
### Development Environment Setup

⚠️ **Critical:** All developers must run the setup script once:

```bash
# This installs pre-commit hooks for automatic code quality enforcement
./scripts/setup-dev.sh
```
Pre-commit hooks automatically enforce:
- Black formatting (88-character lines)
- Import sorting (isort with black profile)
- Static analysis (pylint + mypy)
- YAML validation (yamllint with 120-character lines)
- Documentation synchronization (feature docs, changelog)
- Tox verification (format and lint checks)
Before every commit, the system automatically runs:
- Code formatting and import sorting
- Static analysis and type checking
- Documentation build verification
- Feature documentation synchronization
- Mandatory changelog update verification
## Quick Start
### Basic Usage

```python
import asyncio

from honeyhive import HoneyHiveTracer, trace

# Initialize tracer
tracer = HoneyHiveTracer.init(
    api_key="your-api-key",
    project="your-project",
    source="production",
)

# Use unified decorator for automatic tracing (works with both sync and async)
@trace(event_type="demo", event_name="my_function")
def my_function():
    return "Hello, World!"

@trace(event_type="demo", event_name="my_async_function")
async def my_async_function():
    await asyncio.sleep(0.1)
    return "Hello, Async World!"

# Manual span management
with tracer.start_span("custom-operation"):
    # Your code here
    pass

# With HTTP tracing enabled (new simplified API)
tracer = HoneyHiveTracer.init(
    api_key="your-api-key",
    source="production",
    disable_http_tracing=False,  # project derived from API key
)
```
### Initialization

The `HoneyHiveTracer.init()` method is the recommended way to initialize the tracer:

```python
from honeyhive import HoneyHiveTracer

# Standard initialization
tracer = HoneyHiveTracer.init(
    api_key="your-api-key",
    source="production",  # project derived from API key
)

# With custom server URL for self-hosted deployments
tracer = HoneyHiveTracer.init(
    api_key="your-api-key",
    source="production",
    server_url="https://custom-server.com",  # project derived from API key
)
```
### Enhanced Features Available

```python
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.openai import OpenAIInstrumentor

# All features are available in the init method
tracer = HoneyHiveTracer.init(
    api_key="your-api-key",
    project="your-project",
    source="production",
    test_mode=True,                        # Test mode support
    instrumentors=[OpenAIInstrumentor()],  # Auto-integration
    disable_http_tracing=True,             # Performance control
)
```

✅ The `init` method now supports ALL constructor features!
### OpenInference Integration

```python
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.openai import OpenAIInstrumentor

# Initialize tracer with OpenInference instrumentor (recommended pattern)
tracer = HoneyHiveTracer.init(
    api_key="your-api-key",
    project="your-project",
    source="production",
    instrumentors=[OpenAIInstrumentor()],  # Auto-integration
)

# OpenInference automatically traces OpenAI calls
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
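Conceptually, passing `instrumentors=[...]` amounts to activating each instrumentor (OpenInference instrumentors expose an `instrument()` method) during tracer setup. Here is a minimal illustrative sketch of that wiring; `MiniTracer` and `DummyInstrumentor` are made-up names, not HoneyHive internals:

```python
from typing import Protocol


class Instrumentor(Protocol):
    """Anything exposing .instrument(), like OpenInference instrumentors."""

    def instrument(self, **kwargs) -> None: ...


class MiniTracer:
    """Illustrative stand-in for a tracer that activates its instrumentors."""

    def __init__(self, instrumentors=None):
        self.instrumented = []
        for inst in instrumentors or []:
            inst.instrument()  # each instrumentor patches its target library
            self.instrumented.append(type(inst).__name__)


class DummyInstrumentor:
    """Pretend instrumentor that records that it was activated."""

    activated = False

    def instrument(self, **kwargs) -> None:
        DummyInstrumentor.activated = True


tracer = MiniTracer(instrumentors=[DummyInstrumentor()])
print(tracer.instrumented)  # ['DummyInstrumentor']
```

This is why the `instrumentors=` pattern is convenient: the tracer activates every instrumentor in one place, instead of requiring a separate `instrument()` call per provider.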
### Enriching Spans and Sessions

#### v1.0+ Recommended Pattern: Instance Methods

```python
from honeyhive import HoneyHiveTracer

# Initialize tracer
tracer = HoneyHiveTracer.init(
    api_key="your-api-key",
    project="your-project",
)

# Use instance methods for enrichment (PRIMARY - Recommended)
@tracer.trace(event_type="tool")
def my_function(input_data):
    result = process_data(input_data)

    # ✅ Instance method (PRIMARY pattern in v1.0+)
    tracer.enrich_span(
        metadata={"input": input_data, "result": result},
        metrics={"processing_time_ms": 150},
    )
    return result

# Enrich session with user properties
tracer.enrich_session(
    user_properties={"user_id": "user-123", "plan": "premium"}
)
```
#### Legacy Pattern: Free Functions (Backward Compatibility)

For backward compatibility, the free function pattern from v0.2.x still works:

```python
from honeyhive import trace, enrich_span, enrich_session

# Free functions with automatic tracer discovery (LEGACY)
@trace(event_type="tool")
def my_function(input_data):
    result = process_data(input_data)

    # Free function with auto-discovery (backward compatible)
    enrich_span(
        metadata={"input": input_data, "result": result},
        metrics={"processing_time_ms": 150},
    )
    return result

# Enrich session via free function
enrich_session(user_properties={"user_id": "user-123"})
```
⚠️ **Deprecation Notice:** Free functions will be deprecated in v2.0. We recommend migrating to instance methods for new code.
#### Why Instance Methods?

- ✅ Explicit tracer reference (no auto-discovery overhead)
- ✅ Better multi-instance support (multiple tracers in same process)
- ✅ Clearer code (explicit is better than implicit)
- ✅ Future-proof (primary pattern going forward)
## Architecture
### Core Components

```
src/honeyhive/
├── api/                        # API client implementations
│   ├── client.py               # Main API client
│   ├── configurations.py       # Configuration management
│   ├── datapoints.py           # Data point operations
│   ├── datasets.py             # Dataset operations
│   ├── events.py               # Event management
│   ├── evaluations.py          # Evaluation operations
│   ├── metrics.py              # Metrics operations
│   ├── projects.py             # Project management
│   ├── session.py              # Session operations
│   └── tools.py                # Tool operations
├── tracer/                     # OpenTelemetry integration
│   ├── otel_tracer.py          # Main tracer implementation
│   ├── span_processor.py       # Custom span processor
│   ├── span_exporter.py        # Custom span exporter
│   ├── decorators.py           # Tracing decorators
│   └── http_instrumentation.py # HTTP request tracing
├── evaluation/                 # Evaluation framework
│   └── evaluators.py           # Evaluation decorators
├── models/                     # Pydantic models
│   └── generated.py            # Auto-generated from OpenAPI
└── utils/                      # Utility functions
    ├── config.py               # Configuration management
    ├── connection_pool.py      # HTTP connection pooling
    ├── retry.py                # Retry mechanisms
    └── logger.py               # Logging utilities
```
### Key Design Principles
- Singleton Pattern - Single tracer instance per application
- Environment Configuration - Flexible configuration via environment variables
- Graceful Degradation - Fallback mechanisms for missing dependencies
- Test Isolation - Comprehensive test suite with proper isolation
- OpenTelemetry Compliance - Full OTEL standard compliance
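The singleton and environment-configuration principles above can be sketched together: an `init()` classmethod that reuses one instance per process and falls back to `HH_*` environment variables for missing arguments. This is an illustrative sketch, not the SDK's actual code:

```python
import os


class SingletonTracer:
    """Illustrative singleton: init() reuses one instance per process."""

    _instance = None

    def __init__(self, api_key, source):
        self.api_key = api_key
        self.source = source

    @classmethod
    def init(cls, api_key=None, source=None):
        # Create the instance once; later calls return the same object
        if cls._instance is None:
            cls._instance = cls(
                api_key=api_key or os.getenv("HH_API_KEY"),
                source=source or os.getenv("HH_SOURCE", "production"),
            )
        return cls._instance


a = SingletonTracer.init(api_key="key-1", source="dev")
b = SingletonTracer.init(api_key="key-2")  # ignored: instance already exists
print(a is b)  # True
```

A singleton keeps every decorator and free function in the process pointed at the same exporter pipeline, which is why repeated `init()` calls are safe.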
## Configuration
### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `HH_API_KEY` | HoneyHive API key | Required |
| `HH_API_URL` | API base URL | `https://api.honeyhive.ai` |
| `HH_PROJECT` | Project name | `default` |
| `HH_SOURCE` | Source environment | `production` |
| `HH_DISABLE_TRACING` | Disable tracing completely | `false` |
| `HH_DISABLE_HTTP_TRACING` | Disable HTTP request tracing | `false` |
| `HH_TEST_MODE` | Enable test mode | `false` |
| `HH_DEBUG_MODE` | Enable debug mode | `false` |
| `HH_VERBOSE` | Enable verbose API logging | `false` |
| `HH_OTLP_ENABLED` | Enable OTLP export | `true` |
### Experiment Harness Variables

| Variable | Description | Default |
|---|---|---|
| `HH_EXPERIMENT_ID` | Unique experiment identifier | None |
| `HH_EXPERIMENT_NAME` | Human-readable experiment name | None |
| `HH_EXPERIMENT_VARIANT` | Experiment variant/treatment | None |
| `HH_EXPERIMENT_GROUP` | Experiment group/cohort | None |
| `HH_EXPERIMENT_METADATA` | JSON experiment metadata | None |
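Since `HH_EXPERIMENT_METADATA` carries arbitrary JSON, consumers usually parse it defensively. A sketch (the fallback-to-empty-dict behavior is an assumption, not documented SDK behavior):

```python
import json
import os

# Example value a harness might set before launching a run
os.environ["HH_EXPERIMENT_METADATA"] = '{"model": "gpt-4", "temperature": 0.2}'

try:
    metadata = json.loads(os.getenv("HH_EXPERIMENT_METADATA", "{}"))
except json.JSONDecodeError:
    metadata = {}  # tolerate malformed input rather than crash the run

print(metadata["model"])  # gpt-4
```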
### HTTP Client Configuration

| Variable | Description | Default |
|---|---|---|
| `HH_MAX_CONNECTIONS` | Maximum HTTP connections | 100 |
| `HH_MAX_KEEPALIVE_CONNECTIONS` | Keepalive connections | 20 |
| `HH_KEEPALIVE_EXPIRY` | Keepalive expiry (seconds) | 30.0 |
| `HH_POOL_TIMEOUT` | Connection pool timeout (seconds) | 30.0 |
| `HH_RATE_LIMIT_CALLS` | Rate limit calls per window | 1000 |
| `HH_RATE_LIMIT_WINDOW` | Rate limit window (seconds) | 60.0 |
| `HH_HTTP_PROXY` | HTTP proxy URL | None |
| `HH_HTTPS_PROXY` | HTTPS proxy URL | None |
| `HH_NO_PROXY` | Proxy bypass list | None |
| `HH_VERIFY_SSL` | SSL verification | `true` |
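The `HH_RATE_LIMIT_CALLS` / `HH_RATE_LIMIT_WINDOW` pair suggests a sliding-window limiter: at most N calls within the trailing window. A minimal sketch of that technique (the SDK's actual algorithm is not documented here, and `SlidingWindowRateLimiter` is an illustrative name):

```python
import time
from collections import deque


class SlidingWindowRateLimiter:
    """Allow at most `max_calls` within the trailing `window` seconds."""

    def __init__(self, max_calls, window):
        self.max_calls = max_calls
        self.window = window
        self.calls = deque()  # timestamps of recent allowed calls

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            self.calls.append(now)
            return True
        return False


limiter = SlidingWindowRateLimiter(max_calls=2, window=60.0)
print(limiter.allow(now=0.0))   # True
print(limiter.allow(now=1.0))   # True
print(limiter.allow(now=2.0))   # False (2 calls already in window)
print(limiter.allow(now=61.0))  # True (first call aged out)
```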