# Eco-Compute SDK
A lightweight, provider-agnostic Python SDK that wraps LLM API calls and automatically tracks energy usage, carbon emissions, cost, and sustainability metrics for Generative AI requests.
Built for enterprise ESG reporting, AI sustainability analytics, and responsible AI adoption.
## Features
- 🌱 Sustainability Tracking - Automatic energy, carbon, and cost estimation
- 🔌 Provider Agnostic - Works with any LLM via OpenRouter
- 📊 Explainable Metrics - Deterministic, auditable calculations
- 🚀 Non-Intrusive - Async telemetry that never crashes the host app
- 🔒 Enterprise Ready - Full type hints, ESG-compliant schema
- 📦 Drop-in Replacement - Minimal code changes required
## Installation

```bash
pip install eco-compute
```

Or install from source:

```bash
git clone https://github.com/eco-compute/eco-compute-sdk.git
cd eco-compute-sdk
pip install -e .
```
## Quick Start

```python
from eco_compute import EcoCompute, EcoComputeConfig

# Configure the SDK
config = EcoComputeConfig(
    telemetry_endpoint="https://api.example.com/telemetry",  # Optional
    telemetry_token="your-bearer-token",                     # Optional
    region="us-west-2"  # For carbon intensity calculation
)

# Create client
eco = EcoCompute(config)

# Make an LLM call with sustainability tracking
result = eco.call_llm(
    prompt="What is the capital of France?",
    config={
        "model": "openai/gpt-4o-mini",
        "api_key": "your-openrouter-api-key"
    }
)

# Access the original response (unchanged)
print(result["response"]["choices"][0]["message"]["content"])

# Access sustainability metrics
print(f"Energy: {result['estimation']['energy_wh']:.6f} Wh")
print(f"Carbon: {result['estimation']['co2_g']:.6f} g CO2")
print(f"Cost: ${result['estimation']['cost_usd']:.6f}")
```
## Configuration Options

```python
from eco_compute import EcoComputeConfig

config = EcoComputeConfig(
    # Telemetry settings
    telemetry_endpoint="https://api.example.com/telemetry",
    telemetry_token="your-bearer-token",
    telemetry_enabled=True,

    # OpenRouter settings
    openrouter_base_url="https://openrouter.ai/api/v1",

    # Carbon calculation
    region="us-west-2",              # AWS Oregon (150 gCO2/kWh)
    default_carbon_intensity=400.0,  # gCO2/kWh fallback

    # Custom model factors (Wh per 1000 tokens)
    model_energy_factors={
        "my-custom-model": 0.0030
    },

    # Custom pricing (USD per 1M tokens)
    model_pricing={
        "my-custom-model": {"input": 1.00, "output": 2.00}
    },

    # Behavior
    fail_silently=True,  # Never crash due to telemetry
    batch_size=10,       # Batch telemetry records
    debug=False          # Enable debug logging
)
```
## LLM Request Configuration

```python
result = eco.call_llm(
    prompt="Your prompt here",
    config={
        # Required
        "model": "openai/gpt-4o-mini",
        "api_key": "your-api-key",

        # Optional LLM parameters
        "max_tokens": 1000,
        "temperature": 0.7,
        "top_p": 0.9,

        # Optional tracking metadata
        "use_case": "customer_support",
        "user_id": "user-123",
        "agent_id": "agent-456",
        "app_id": "my-app",
        "risk_level": "low",  # low, medium, high, critical
        "meta_data": {"department": "sales"}
    }
)
```
## Telemetry Schema

All telemetry records follow this ESG-compliant schema:

```json
{
  "id": "uuid",
  "app_id": "my-app",
  "model_id": "openai/gpt-4o-mini",
  "user_id": "user-123",
  "agent_id": "agent-456",
  "timestamp": "2024-01-15T10:30:00Z",
  "request_hash": "sha256-hash",
  "computer_name": "DESKTOP-ABC123",
  "process_name": "python",
  "model_name": "openai/gpt-4o-mini",
  "provider": "openai",
  "tokens_input": 50,
  "tokens_output": 100,
  "tokens_total": 150,
  "energy_wh": 0.000375,
  "co2_g": 0.00005625,
  "region": "us-west-2",
  "carbon_intensity": 150.0,
  "latency_ms": 1234.56,
  "use_case": "customer_support",
  "risk_level": "low",
  "policy_applied": "",
  "policy_action": "allow",
  "cost_usd": 0.000075,
  "meta_data": {"department": "sales"},
  "created_at": "2024-01-15T10:30:00Z"
}
```
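Several fields in this schema are derived rather than supplied by the caller. As a rough sketch of how they could be populated, assuming `request_hash` is a SHA-256 digest of the UTF-8 prompt text (the exact hash input is not documented here, and `build_record` is a hypothetical helper, not the SDK's internals):

```python
import hashlib
import uuid
from datetime import datetime, timezone

def build_record(prompt: str, tokens_input: int, tokens_output: int) -> dict:
    """Sketch of filling a telemetry record's derived fields."""
    return {
        "id": str(uuid.uuid4()),
        # ISO-8601 UTC timestamp, matching the schema example format
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        # Assumption: SHA-256 over the raw prompt text
        "request_hash": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        "tokens_input": tokens_input,
        "tokens_output": tokens_output,
        "tokens_total": tokens_input + tokens_output,
    }

record = build_record("What is the capital of France?", 50, 100)
```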
## Sustainability Formulas

### Energy Estimation

```
energy_wh = tokens_total × (model_energy_factor / 1000)
```

Model energy factors are derived from published ML carbon-footprint research and represent estimated Wh per 1000 tokens.

### Carbon Estimation

```
co2_g = (energy_wh / 1000) × carbon_intensity
```

Carbon intensity values are region-specific (gCO2/kWh), based on EPA eGRID, IEA data, and cloud provider reports.

### Cost Estimation

```
cost_usd = (tokens_input × input_price / 1M) + (tokens_output × output_price / 1M)
```
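The three formulas above can be sketched directly in Python. The energy factor (0.0025 Wh per 1000 tokens) and the per-1M-token prices below are illustrative assumptions chosen so the output matches the example numbers in the telemetry schema; they are not the SDK's published factors or real provider pricing.

```python
def estimate(tokens_input: int, tokens_output: int,
             factor_wh_per_1k: float, carbon_intensity_g_per_kwh: float,
             input_price_per_1m: float, output_price_per_1m: float) -> dict:
    tokens_total = tokens_input + tokens_output
    # energy_wh = tokens_total × (model_energy_factor / 1000)
    energy_wh = tokens_total * (factor_wh_per_1k / 1000)
    # co2_g = (energy_wh / 1000) × carbon_intensity  (Wh → kWh × gCO2/kWh)
    co2_g = (energy_wh / 1000) * carbon_intensity_g_per_kwh
    # cost_usd: prices quoted in USD per 1M tokens
    cost_usd = (tokens_input * input_price_per_1m / 1_000_000
                + tokens_output * output_price_per_1m / 1_000_000)
    return {"energy_wh": energy_wh, "co2_g": co2_g, "cost_usd": cost_usd}

# Reproduces the telemetry-schema example: 50 input + 100 output tokens
# in us-west-2 (150 gCO2/kWh), with illustrative factor and prices.
est = estimate(50, 100, 0.0025, 150.0, 0.30, 0.60)
```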
## Standalone Estimators

Use the estimators directly without making API calls:

```python
from eco_compute import (
    estimate_energy,
    estimate_carbon,
    estimate_cost,
    compare_regions,
    compare_models_cost,
    get_cleanest_regions,
)

# Estimate energy for a request
energy_wh = estimate_energy(tokens_total=1000, model_name="gpt-4")
print(f"Energy: {energy_wh} Wh")

# Estimate carbon emissions
co2_g = estimate_carbon(energy_wh=0.005, region="us-east")
print(f"Carbon: {co2_g} g CO2")

# Compare carbon across regions
comparison = compare_regions(energy_wh=1.0)
print(comparison)  # {"us-east": 0.4, "eu-north-1": 0.03, ...}

# Find the cleanest regions
cleanest = get_cleanest_regions(top_n=5)
print(cleanest)  # [("nuclear", 15.0), ("wind", 15.0), ...]

# Compare costs across models
cost_comparison = compare_models_cost(tokens_input=1000, tokens_output=500)
print(cost_comparison)
```
## Pre-Request Estimation

Estimate costs before making requests:

```python
eco = EcoCompute(config)

# Estimate without making an API call
estimate = eco.estimate_only(
    tokens_input=1000,
    tokens_output=500,
    model="gpt-4"
)
print(f"Estimated cost: ${estimate['cost_usd']:.4f}")
print(f"Estimated carbon: {estimate['co2_g']:.6f} g CO2")
```
## Context Manager

Use as a context manager for automatic cleanup:

```python
from eco_compute import EcoCompute, EcoComputeConfig

with EcoCompute(EcoComputeConfig(region="eu-north-1")) as eco:
    result = eco.call_llm(
        "Hello!",
        {"model": "openai/gpt-4o-mini", "api_key": "..."}
    )
    print(result["estimation"]["co2_g"])

# Telemetry is automatically flushed and resources are cleaned up on exit
```
## Supported Models
The SDK includes energy factors and pricing for 50+ models including:
- OpenAI: GPT-4, GPT-4 Turbo, GPT-4o, GPT-3.5 Turbo, o1
- Anthropic: Claude 3 Opus/Sonnet/Haiku, Claude 3.5 Sonnet
- Google: Gemini Pro, Gemini 1.5 Pro/Flash, Gemini Ultra
- Meta: Llama 2/3/3.1 (7B to 405B), CodeLlama
- Mistral: Mistral 7B, Mixtral 8x7B/8x22B, Mistral Large
- Cohere: Command, Command-R, Command-R+
- Others: Phi-2/3, Yi-34B, Qwen-72B, DeepSeek
Unknown models use conservative default estimates.
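That fallback behavior can be sketched as a simple dictionary lookup with a default. The table names and numbers below are illustrative assumptions, not the SDK's internal data:

```python
# Illustrative factor table (Wh per 1000 tokens); values are assumptions.
KNOWN_ENERGY_FACTORS = {
    "openai/gpt-4o-mini": 0.0025,
    "anthropic/claude-3-haiku": 0.0030,
}
DEFAULT_ENERGY_FACTOR = 0.0040  # conservative fallback for unknown models

def energy_factor(model_name: str) -> float:
    # dict.get falls back to the conservative default for unlisted models
    return KNOWN_ENERGY_FACTORS.get(model_name, DEFAULT_ENERGY_FACTOR)
```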
## Supported Regions
Carbon intensity data for 40+ regions including:
- US: us-east, us-west-1, us-west-2, us-central
- Europe: eu-west-1/2/3, eu-central-1, eu-north-1
- Asia Pacific: ap-northeast-1/2, ap-southeast-1/2, ap-south-1
- Other: Canada, Brazil, Middle East, Africa
## Development

```bash
# Clone the repository
git clone https://github.com/eco-compute/eco-compute-sdk.git
cd eco-compute-sdk

# Install dev dependencies
pip install -e ".[dev]"

# Run type checking
mypy eco_compute

# Format code
black eco_compute
isort eco_compute

# Run linting
flake8 eco_compute
```
## Building and Publishing

```bash
# Install build tools
pip install build twine

# Build the package
python -m build

# Upload to PyPI
twine upload dist/*
```
## License
MIT License - see LICENSE for details.
## Contributing
Contributions are welcome! Please read our Contributing Guidelines first.
## Enterprise Support
For enterprise deployments, custom integrations, or sustainability consulting, contact us at ecocompute@example.com.
Built with 💚 for a sustainable AI future.
## Download files
### Source distribution: eco_compute_sdk-0.1.0.tar.gz

- Upload date:
- Size: 29.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.11

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7d1c601e350bed8b221d24ebfda470403e31766a5771a1fab33a1ddef5942a14` |
| MD5 | `dcc3eed31d9d7554b649efd9ee70cfba` |
| BLAKE2b-256 | `0ed7f726d07e6fc22c8f0bd13cbad482f8268a1d5a703b341e3858832424bcb1` |
### Built distribution: eco_compute_sdk-0.1.0-py3-none-any.whl

- Upload date:
- Size: 30.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.11

| Algorithm | Hash digest |
|---|---|
| SHA256 | `fe267368e4aa9921a863beb47ad0c5c45956a41ba5e0b3b284ff980b9f2727c1` |
| MD5 | `0706b906509461036d6d8ad36a238435` |
| BLAKE2b-256 | `5538605ac85553fb6bdeb424a05f662df859b54272e25ce07aa895cec4088b98` |