# Revenium Universal Griptape Driver

Universal Revenium middleware for the Griptape framework: transparent AI usage metering across all providers.
Add automatic AI usage metering and cost tracking to your Griptape applications with zero code changes. Works with any LLM provider that Griptape supports - simply wrap your existing driver and get detailed analytics via Revenium.
## What This Does
This package provides a universal wrapper for any Griptape prompt driver that automatically:
- Meters all AI usage across any provider and sends data to Revenium for analytics
- Tracks costs and token usage with provider-specific optimizations
- Maintains full compatibility with existing Griptape code
- Supports rich metadata for detailed reporting and cost allocation
- Auto-detects providers and uses the best available middleware
## Provider Support Architecture

### Tier 1: Direct Native Support ⚡

Best performance with optimized provider-specific middleware:

- OpenAI → `revenium-middleware-openai`
- Anthropic → `revenium-middleware-anthropic`
- Ollama → `revenium-middleware-ollama`

### Tier 2: Universal LiteLLM Support 🌐

Supports 100+ providers through LiteLLM integration:

- Google/Gemini → via `revenium-middleware-litellm`
- Cohere → via `revenium-middleware-litellm`
- Azure OpenAI → via `revenium-middleware-litellm`
- AWS Bedrock → via `revenium-middleware-litellm`
- LiteLLM Proxy → via `revenium-middleware-litellm` (centralized proxy server)
- And 100+ more → see the LiteLLM docs
## How It Works

1. Wrap any Griptape driver with `ReveniumDriver`
2. Add optional metadata for tracking
3. Your AI calls are automatically metered and reported to Revenium
4. View analytics and cost data in your Revenium dashboard
## Installation

```shell
# Base package
pip install revenium-griptape

# Install middleware for your providers
pip install revenium-middleware-openai     # For OpenAI (Tier 1)
pip install revenium-middleware-anthropic  # For Anthropic (Tier 1)
pip install revenium-middleware-ollama     # For Ollama (Tier 1)
pip install revenium-middleware-litellm    # For all other providers (Tier 2)
```
## Environment Setup

Create a `.env` file in your project:

```shell
# Required: Your Revenium API key for metering
REVENIUM_METERING_API_KEY=your_revenium_api_key_here

# Optional: Custom Revenium endpoint (defaults to production)
REVENIUM_METERING_BASE_URL=https://api.revenium.io/meter

# Tier 1 provider API keys (direct support)
OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
# OLLAMA_HOST=http://localhost:11434  # For local Ollama

# Tier 2 provider API keys (via LiteLLM - as needed)
GEMINI_API_KEY=your_google_gemini_api_key_here
COHERE_API_KEY=your_cohere_api_key_here
AWS_ACCESS_KEY_ID=your_aws_access_key_here      # For Bedrock
AWS_SECRET_ACCESS_KEY=your_aws_secret_key_here  # For Bedrock
AZURE_API_KEY=your_azure_openai_api_key_here    # For Azure OpenAI

# LiteLLM proxy configuration (optional - for a centralized proxy server)
LITELLM_PROXY_URL=http://localhost:4000/chat/completions
LITELLM_API_KEY=sk-1234

# See the LiteLLM docs for other provider environment variables
```
**Tier 2 environment variables:** For Tier 2 providers, environment variable names follow LiteLLM conventions; each provider has its own (e.g., `GEMINI_API_KEY` for Google, `COHERE_API_KEY` for Cohere). See the LiteLLM provider documentation for the complete list.
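Since a missing metering key is the most common setup failure, a small startup check can save debugging time. This is an optional sketch, not part of the package; the helper name `missing_vars` is ours:

```python
import os

# Variables this README treats as required; extend the list with your provider keys.
REQUIRED_VARS = ["REVENIUM_METERING_API_KEY"]


def missing_vars(required=REQUIRED_VARS, env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not env.get(name)]


if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
    print("Environment looks ready for metering.")
```

Run it once after editing `.env` (and after `load_dotenv()`) to confirm the keys your providers need are actually visible to the process.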
## Quick Start

### Universal Driver (Recommended)
```python
import os
from dotenv import load_dotenv

# ✅ CRITICAL: Load environment variables BEFORE importing
load_dotenv()

from griptape.structures import Agent
from griptape.drivers.prompt.openai_chat_prompt_driver import OpenAiChatPromptDriver

# The universal driver works with ANY Griptape prompt driver
from revenium_griptape import ReveniumDriver

# Method 1: Wrap an existing Griptape driver
base_driver = OpenAiChatPromptDriver(
    model="gpt-4o-mini",
    api_key=os.getenv("OPENAI_API_KEY"),
)

driver = ReveniumDriver(
    base_driver=base_driver,
    usage_metadata={
        "subscriber": {
            "id": "user-123",
            "email": "user@company.com",
            "credential": {
                "name": "user_api_key",
                "value": "user_key_value",
            },
        },
        "task_type": "qa",
    },
)

# Method 2: Auto-detect from the model name
driver = ReveniumDriver(
    model="gpt-4o-mini",  # Auto-detects OpenAI
    usage_metadata={
        "subscriber": {
            "id": "user-123",
            "email": "user@company.com",
            "credential": {
                "name": "user_api_key",
                "value": "user_key_value",
            },
        },
        "task_type": "qa",
    },
)

# Use it exactly like any Griptape driver - metering is automatic!
agent = Agent(prompt_driver=driver)
result = agent.run("What is the capital of France?")
print(result.output.value)
```
### Direct Driver Usage

```python
import os

from revenium_griptape import ReveniumOpenAiDriver, ReveniumAnthropicDriver

# OpenAI (direct)
openai_driver = ReveniumOpenAiDriver(
    model="gpt-4o-mini",
    api_key=os.getenv("OPENAI_API_KEY"),
    usage_metadata={
        "subscriber": {
            "id": "user-123",
            "email": "user@company.com",
            "credential": {
                "name": "openai_api_key",
                "value": "openai_key_value",
            },
        },
    },
)

# Anthropic (direct)
anthropic_driver = ReveniumAnthropicDriver(
    model="claude-3-haiku-20240307",
    api_key=os.getenv("ANTHROPIC_API_KEY"),
    usage_metadata={
        "subscriber": {
            "id": "user-123",
            "email": "user@company.com",
            "credential": {
                "name": "anthropic_api_key",
                "value": "anthropic_key_value",
            },
        },
    },
)

# LiteLLM for 100+ providers
from revenium_griptape import ReveniumLiteLLMDriver

# Google Gemini via LiteLLM
gemini_driver = ReveniumLiteLLMDriver(
    model="gemini-pro",
    usage_metadata={
        "subscriber": {
            "id": "user-123",
            "email": "user@company.com",
            "credential": {
                "name": "gemini_api_key",
                "value": "gemini_key_value",
            },
        },
    },
)

# LiteLLM proxy
proxy_driver = ReveniumLiteLLMDriver(
    model="gpt-4",  # A model available through your proxy
    proxy_url=os.getenv("LITELLM_PROXY_URL"),
    proxy_api_key=os.getenv("LITELLM_API_KEY"),
    usage_metadata={
        "subscriber": {
            "id": "user-123",
            "email": "user@company.com",
            "credential": {
                "name": "proxy_api_key",
                "value": "proxy_key_value",
            },
        },
    },
)
```
## Supported Metadata Fields

Add any of these optional fields to track usage across different dimensions:

| Field | Description | Example |
|---|---|---|
| `trace_id` | Session or conversation ID | `"chat-session-123"` |
| `task_type` | Type of AI task | `"summarization"`, `"qa"` |
| `subscriber` | User information object | `{"id": "user-456", "email": "user@company.com"}` |
| `subscriber.id` | User ID | `"user-456"` |
| `subscriber.email` | User email | `"user@company.com"` |
| `subscriber.credential` | API credential info | `{"name": "api_key", "value": "key_value"}` |
| `organization_id` | Team or department | `"sales-team"` |
| `subscription_id` | Billing plan | `"enterprise-plan"` |
| `product_id` | Product or feature | `"ai-assistant"` |
| `agent` | AI agent identifier | `"support-bot-v2"` |
## Examples

| Example | Description |
|---|---|
| Universal Driver | **Recommended** - multi-provider with auto-detection |
| OpenAI Direct | Direct OpenAI integration |
| Anthropic Direct | Direct Anthropic/Claude integration |
| LiteLLM Client | LiteLLM direct client integration |
| LiteLLM Proxy | LiteLLM proxy server integration |
Each example is self-contained with environment setup, error handling, and detailed explanations.
**Quick start:** run `python examples/universal_example.py` after setting up your environment variables.