
Revenium Middleware for Anthropic


A production-ready middleware library for metering and monitoring Anthropic API usage in Python applications. Supports both the direct Anthropic API and AWS Bedrock, with comprehensive streaming support.

Features

  • Precise Usage Tracking: Monitor tokens, costs, and request counts for Anthropic chat completions
  • Seamless Integration: Drop-in middleware that works with minimal code changes
  • AWS Bedrock Support: Full integration with automatic detection and metering for Anthropic models via AWS Bedrock
  • Complete Streaming Support: Full streaming functionality for both Anthropic API and AWS Bedrock
  • Hybrid Initialization: Auto-initialization on import + explicit control for advanced configuration
  • Thread-Safe: Production-ready with comprehensive thread safety for concurrent applications
  • Flexible Configuration: Customize metering behavior to suit your application needs

What's Supported

| Feature | Direct Anthropic API | AWS Bedrock |
|---|---|---|
| Chat Completion | Full support | Full support |
| Streaming | Full support | Full support |
| Token Metering | Automatic | Automatic |
| Metadata Tracking | Full support | Full support |
| Thread Safety | Production-ready | Production-ready |
| Auto-initialization | Zero-config | Zero-config |

Note: The middleware only wraps messages.create and messages.stream endpoints. Other Anthropic SDK features work normally but aren't metered.

Installation

```shell
# Basic installation
pip install revenium-middleware-anthropic

# With AWS Bedrock support
pip install revenium-middleware-anthropic[bedrock]
```

Quick Start

Set your environment variables and import the middleware - your Anthropic calls will be metered automatically:

```python
import anthropic
import revenium_middleware_anthropic  # Auto-initializes on import

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=100,
    messages=[{"role": "user", "content": "Please verify you are ready to assist me."}]
)
print(message.content[0].text)
```

That's it! The middleware automatically meters all Anthropic API calls. No code changes required.

For complete examples, see the examples/ directory (e.g. examples/anthropic-bedrock.py and examples/anthropic-streaming.py).

Configuration

Set these environment variables before running your application:

```shell
export REVENIUM_METERING_API_KEY="your-api-key"
export REVENIUM_METERING_BASE_URL="https://api.revenium.ai/meter"
export ANTHROPIC_API_KEY="your-anthropic-key"
```

Optional: Enable debug logging with export REVENIUM_LOG_LEVEL=DEBUG

AWS Bedrock Integration

The middleware provides complete AWS Bedrock integration with automatic detection and full streaming support.

Provider Detection: The middleware automatically chooses between Bedrock and direct Anthropic API based on:

  • AWS credentials availability (aws configure, IAM roles, environment variables)
  • Base URL detection (when base_url contains amazonaws.com)
  • Defaults to the direct Anthropic API for safety - Bedrock is used only when explicitly configured

See examples/anthropic-bedrock.py for complete working examples covering:

  • Basic chat completion via AWS Bedrock
  • Metadata tracking with Bedrock
  • Streaming support with Bedrock
  • Model mapping and configuration
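Given the detection rules above, routing requests through Bedrock is mostly a matter of supplying AWS credentials before your application starts. A minimal sketch with placeholder values (the credential values and script name are illustrative):

```shell
# Provide AWS credentials so the middleware's provider detection selects Bedrock
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_REGION="us-east-1"   # default Bedrock region

# Revenium metering configuration, as in the Configuration section
export REVENIUM_METERING_API_KEY="your-api-key"
export REVENIUM_METERING_BASE_URL="https://api.revenium.ai/meter"

# The Quick Start code then runs unchanged; requests route to Bedrock
# python your_app.py
```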

Bedrock Configuration

Environment Variables

| Variable | Description | Default |
|---|---|---|
| REVENIUM_METERING_API_KEY | Your Revenium API key | Required |
| REVENIUM_METERING_BASE_URL | Revenium API endpoint | Required |
| AWS_REGION | AWS region for Bedrock | us-east-1 |
| REVENIUM_BEDROCK_DISABLE | Set to 1 to disable Bedrock support | Not set |

AWS Authentication

The middleware uses the standard AWS credential chain:

  1. Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
  2. AWS credentials file (~/.aws/credentials)
  3. IAM roles (for EC2/Lambda/ECS)
  4. AWS SSO

Required AWS permissions:

  • bedrock:InvokeModel (for non-streaming requests)
  • bedrock:InvokeModelWithResponseStream (for streaming requests)

Supported Models

The middleware automatically maps Anthropic model names to Bedrock model IDs:

| Anthropic Model | Bedrock Model ID |
|---|---|
| claude-3-opus-20240229 | anthropic.claude-3-opus-20240229-v1:0 |
| claude-3-sonnet-20240229 | anthropic.claude-3-sonnet-20240229-v1:0 |
| claude-3-haiku-20240307 | us.anthropic.claude-3-5-haiku-20241022-v1:0 |
| claude-3-5-sonnet-20240620 | anthropic.claude-3-5-sonnet-20240620-v1:0 |
| claude-3-5-sonnet-20241022 | anthropic.claude-3-5-sonnet-20241022-v2:0 |
| claude-3-5-haiku-20241022 | anthropic.claude-3-5-haiku-20241022-v1:0 |

For other models, the middleware uses the format `anthropic.{model_name}`.
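The mapping above can be sketched as a simple lookup with the documented fallback. This is an illustrative reconstruction from the table, not the middleware's actual source:

```python
# Illustrative reconstruction of the model-name mapping documented above;
# the middleware's internal table may differ.
BEDROCK_MODEL_IDS = {
    "claude-3-opus-20240229": "anthropic.claude-3-opus-20240229-v1:0",
    "claude-3-sonnet-20240229": "anthropic.claude-3-sonnet-20240229-v1:0",
    "claude-3-haiku-20240307": "us.anthropic.claude-3-5-haiku-20241022-v1:0",
    "claude-3-5-sonnet-20240620": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "claude-3-5-sonnet-20241022": "anthropic.claude-3-5-sonnet-20241022-v2:0",
    "claude-3-5-haiku-20241022": "anthropic.claude-3-5-haiku-20241022-v1:0",
}

def to_bedrock_model_id(model_name: str) -> str:
    """Return the Bedrock model ID, falling back to anthropic.{model_name}."""
    return BEDROCK_MODEL_IDS.get(model_name, f"anthropic.{model_name}")

print(to_bedrock_model_id("claude-3-opus-20240229"))
# → anthropic.claude-3-opus-20240229-v1:0
```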

Streaming Support

The middleware provides complete streaming support for both direct Anthropic API and AWS Bedrock with:

  • Universal Interface: Same code works with both providers
  • Automatic Detection: Provider routing happens transparently
  • Complete Token Tracking: Accurate token counting and metering
  • Thread-Safe: Production-ready concurrent streaming support
  • Graceful Fallback: Automatic fallback to direct API if Bedrock fails

See examples/anthropic-streaming.py for complete streaming examples with metadata tracking and error handling.
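Because messages.stream is one of the two wrapped endpoints, the standard Anthropic streaming interface is metered as-is once the middleware is imported. A minimal inline sketch (assumes valid Anthropic and Revenium credentials; the model name and prompt are illustrative):

```python
import anthropic
import revenium_middleware_anthropic  # noqa: F401 - auto-initializes on import

client = anthropic.Anthropic()

# Tokens are counted and metered automatically when the stream completes.
with client.messages.stream(
    model="claude-3-5-haiku-20241022",
    max_tokens=200,
    messages=[{"role": "user", "content": "Stream a short haiku about rivers."}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```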

Metadata Fields

Add business context to track usage by organization, user, task type, or custom fields. Pass a usage_metadata dictionary with any of these optional fields:

| Field | Description | Use Case |
|---|---|---|
| trace_id | Unique identifier for session or conversation tracking | Link multiple API calls together for debugging, user session analytics, or distributed tracing across services |
| task_type | Type of AI task being performed | Categorize usage by workload (e.g., "chat", "code-generation", "doc-summary") for cost analysis and optimization |
| subscriber.id | Unique user identifier | Track individual user consumption for billing, rate limiting, or user analytics |
| subscriber.email | User email address | Identify users for support, compliance, or usage reports |
| subscriber.credential.name | Authentication credential name | Track which API key or service account made the request |
| subscriber.credential.value | Authentication credential value | Associate usage with specific credentials for security auditing |
| organization_id | Organization or company identifier | Multi-tenant cost allocation, usage quotas per organization |
| subscription_id | Subscription plan identifier | Track usage against subscription limits, identify plan upgrade opportunities |
| product_id | Your product or feature identifier | Attribute AI costs to specific features in your application (e.g., "chatbot", "email-assistant") |
| agent | AI agent or bot identifier | Distinguish between multiple AI agents or automation workflows in your system |
| response_quality_score | Custom quality rating (0.0-1.0) | Track user satisfaction or automated quality metrics for model performance analysis |
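As a concrete example, the fields above can be assembled into a usage_metadata dictionary. All values here are placeholders, and the nested subscriber shape is inferred from the dotted field names (subscriber.id etc.), so the exact structure may differ in practice:

```python
# Placeholder values; field names follow the table above. The nested
# "subscriber" object is inferred from the dotted names and is an assumption.
usage_metadata = {
    "trace_id": "conv-28a7e9d4",
    "task_type": "doc-summary",
    "subscriber": {
        "id": "user-1234",
        "email": "user@example.com",
        "credential": {"name": "prod-api-key", "value": "credential-ref"},
    },
    "organization_id": "org-acme",
    "subscription_id": "sub-enterprise",
    "product_id": "email-assistant",
    "agent": "support-bot",
    "response_quality_score": 0.9,
}

# Passed alongside the normal arguments, e.g.:
# client.messages.create(model=..., max_tokens=..., messages=[...],
#                        usage_metadata=usage_metadata)
```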


Troubleshooting

| Issue | Solution |
|---|---|
| "No module named 'boto3'" | Install with Bedrock support: pip install revenium-middleware-anthropic[bedrock] |
| Requests go to Anthropic instead of Bedrock | Verify AWS credentials: aws sts get-caller-identity |
| "AccessDenied" errors | Ensure AWS credentials have bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream permissions |
| Model not available | Check if Claude models are available in your AWS region |
| Middleware not working | Verify REVENIUM_METERING_API_KEY and REVENIUM_METERING_BASE_URL are set |
| Streaming errors | Check AWS credentials; middleware automatically falls back to direct API |

  • Debug Mode: Set REVENIUM_LOG_LEVEL=DEBUG to see provider detection and routing decisions
  • Force Direct API: Set REVENIUM_BEDROCK_DISABLE=1 to disable Bedrock detection
  • Check Status: Use revenium_middleware_anthropic.is_initialized() to verify setup
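The is_initialized() check mentioned above can serve as a startup guard. A minimal sketch (requires the middleware to be installed):

```python
import revenium_middleware_anthropic

# Confirm the auto-initialization that runs on import has completed
# before serving traffic; the error message here is illustrative.
if not revenium_middleware_anthropic.is_initialized():
    raise RuntimeError(
        "Revenium middleware failed to initialize - "
        "check REVENIUM_METERING_API_KEY and REVENIUM_METERING_BASE_URL"
    )
```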

Compatibility

  • Python 3.8+
  • Anthropic Python SDK (latest version recommended)
  • AWS Bedrock (with boto3>=1.34.0 when using [bedrock] extra)
  • Thread-Safe (production-ready for concurrent applications)

Logging

Control logging with the REVENIUM_LOG_LEVEL environment variable. Available levels:

  • DEBUG: Detailed debugging information (provider detection, routing decisions)
  • INFO: General information (default)
  • WARNING: Warning messages only
  • ERROR: Error messages only

Documentation

For detailed documentation, visit docs.revenium.io

Contributing

See CONTRIBUTING.md

Code of Conduct

See CODE_OF_CONDUCT.md

Security

See SECURITY.md

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For issues, feature requests, or contributions, please open an issue or pull request on the project repository.


Built by Revenium
