llm-eco-tracker
Carbon-aware scheduling and telemetry for LLM applications
llm-eco-tracker is a Python library for carbon-aware LLM execution. It wraps your application code with a @carbon_aware decorator, delays non-urgent work into greener grid windows, captures session energy through EcoLogits, and writes telemetry you can analyze later.
It is designed for normal application code, scheduled batch jobs, and agentic workflows built on top of OpenAI- and Anthropic-backed stacks such as LangChain.
Why It Exists
Most LLM applications run immediately, even when the grid is unusually carbon-intensive. llm-eco-tracker gives you a lightweight software layer that can:
- delay flexible work until a cleaner grid window
- record baseline vs actual carbon emissions
- downgrade to smaller models on dirty grids
- stop a run when a per-session carbon budget is exceeded
- plug into existing OpenAI and Anthropic SDK usage without infrastructure changes
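As an illustration of the first point, scheduling into a greener window can be sketched as picking the lowest-intensity start hour within the allowed delay budget. This is a simplified model for intuition, not the library's actual scheduling code:

```python
from typing import Sequence

def pick_start_hour(forecast_gco2_per_kwh: Sequence[float], max_delay_hours: int) -> int:
    """Return the offset (in hours, 0 = now) with the lowest forecast
    carbon intensity within the allowed delay window.

    forecast_gco2_per_kwh: hourly grid intensity forecast, index 0 = now.
    """
    window = forecast_gco2_per_kwh[: max_delay_hours + 1]
    if not window:
        raise ValueError("forecast must cover at least the current hour")
    # Greedy choice: run at the cleanest hour we are allowed to wait for.
    return min(range(len(window)), key=lambda h: window[h])
```

For example, with a forecast of [320, 280, 150, 90] gCO2eq/kWh and a two-hour delay budget, the cleanest reachable hour is offset 2.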
Features
- Carbon-aware scheduling with configurable delay budgets
- Forecast providers for UK Carbon Intensity, Electricity Maps, and deterministic CSV traces
- Telemetry adapters for OpenAI chat completions and Anthropic messages
- Per-session model usage summaries
- Dirty-grid eco-fallbacks / model downgrades
- Carbon circuit breaker / budget enforcement
- JSONL, logger, composite, and no-op telemetry sinks
- Benchmark, analysis, and figure-generation scripts for evaluation
Installation
From Source
```bash
pip install -r requirements.txt
pip install -e .
```
From PyPI
```bash
pip install llm-eco-tracker
```
Optional Extras
```bash
pip install "llm-eco-tracker[langchain]"
pip install "llm-eco-tracker[benchmarks]"
pip install "llm-eco-tracker[dev]"
```
Quickstart
```python
from openai import OpenAI

from llm_eco_tracker import carbon_aware

client = OpenAI()


@carbon_aware(max_delay_hours=2)
def summarize(text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are concise."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content
```
By default, telemetry is written to ./eco_telemetry.jsonl.
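Because the sink is plain JSONL, telemetry is easy to post-process with the standard library. The sketch below sums carbon savings across runs; the field names `baseline_gco2eq` and `actual_gco2eq` are illustrative assumptions, not necessarily the library's exact record schema:

```python
import json
from pathlib import Path

def total_saved_gco2eq(telemetry_path: str) -> float:
    """Sum carbon savings across all runs in a JSONL telemetry file.

    Assumes each line is a JSON object with illustrative keys
    'baseline_gco2eq' and 'actual_gco2eq'; the real schema may differ.
    """
    saved = 0.0
    for line in Path(telemetry_path).read_text().splitlines():
        if not line.strip():
            continue  # tolerate blank lines between records
        record = json.loads(line)
        saved += record["baseline_gco2eq"] - record["actual_gco2eq"]
    return saved
```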
Provider Support
Forecast Providers
- UKCarbonIntensityProvider
- ElectricityMapsProvider
- CsvForecastProvider
Telemetry Adapters
- OpenAI: client.chat.completions.create(...)
- Anthropic: client.messages.create(...)
LangChain / Agentic Workflows
llm-eco-tracker works with LangChain-style agentic workflows as long as the actual model calls inside the workflow go through supported SDKs such as OpenAI or Anthropic. That is the safest integration pattern today, and it covers agent loops, planners, critics, and multi-step chains.
The repository includes a runnable mocked demo at langchain_agentic_demo.py. It uses LangChain runnables to orchestrate a multi-step workflow while the underlying OpenAI SDK calls are intercepted by @carbon_aware.
Forecast Provider Examples
Deterministic CSV Trace
```python
from llm_eco_tracker import carbon_aware
from llm_eco_tracker.providers import CsvForecastProvider


@carbon_aware(
    max_delay_hours=2,
    forecast_provider=CsvForecastProvider("tests/fixtures/mock_forecast.csv"),
)
def run_batch_job():
    ...
```
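A deterministic trace of this kind is essentially timestamped intensity rows. A minimal parser might look like the following; the column names are assumptions for illustration, not the fixture's actual layout:

```python
import csv
from io import StringIO

def parse_forecast_csv(text: str) -> list[tuple[str, float]]:
    """Parse a CSV trace with assumed columns 'timestamp' and
    'intensity_gco2_per_kwh' into (timestamp, intensity) pairs."""
    reader = csv.DictReader(StringIO(text))
    return [
        (row["timestamp"], float(row["intensity_gco2_per_kwh"]))
        for row in reader
    ]
```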
Electricity Maps
```python
import os

from llm_eco_tracker import carbon_aware
from llm_eco_tracker.providers import ElectricityMapsProvider

electricity_maps = ElectricityMapsProvider(
    zone="DE",
    auth_token=os.environ["ELECTRICITY_MAPS_API_TOKEN"],
)


@carbon_aware(
    max_delay_hours=2,
    forecast_provider=electricity_maps,
)
def run_with_electricity_maps():
    ...
```
Eco-Fallbacks and Carbon Budgets
Dirty-Grid Model Downgrade
```python
from llm_eco_tracker import carbon_aware


@carbon_aware(
    max_delay_hours=2,
    auto_downgrade=True,
    dirty_threshold=300.0,
    model_fallbacks={"gpt-4.1": "gpt-4.1-mini"},
)
def run_with_fallback():
    ...
```
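Conceptually, the downgrade decision reduces to a threshold check against the fallback map. A hedged sketch of that logic (not the library's internal implementation):

```python
def choose_model(requested: str, intensity_gco2_per_kwh: float,
                 dirty_threshold: float, fallbacks: dict[str, str]) -> str:
    """Return the model to use: downgrade when the grid is dirtier than
    the threshold and a fallback is configured, else keep the request."""
    if intensity_gco2_per_kwh > dirty_threshold and requested in fallbacks:
        return fallbacks[requested]
    return requested
```

With dirty_threshold=300.0, a 450 gCO2eq/kWh grid maps "gpt-4.1" to "gpt-4.1-mini", while a 120 gCO2eq/kWh grid leaves the request unchanged.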
Circuit Breaker
```python
from llm_eco_tracker import CarbonBudgetExceededError, carbon_aware


@carbon_aware(max_session_gco2eq=5.0)
def run_budgeted_workflow():
    ...


try:
    run_budgeted_workflow()
except CarbonBudgetExceededError as exc:
    print(exc.actual_gco2eq, exc.max_session_gco2eq)
```
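The circuit breaker amounts to accumulating per-call emissions and aborting once the session budget is spent. A self-contained sketch with a stand-in exception class (names modelled on, but not taken from, the library):

```python
class CarbonBudgetExceeded(RuntimeError):
    """Illustrative stand-in for the library's CarbonBudgetExceededError."""

    def __init__(self, actual: float, budget: float):
        super().__init__(f"{actual:.2f} gCO2eq exceeds budget {budget:.2f}")
        self.actual_gco2eq = actual
        self.max_session_gco2eq = budget


class BudgetTracker:
    def __init__(self, max_session_gco2eq: float):
        self.budget = max_session_gco2eq
        self.used = 0.0

    def record(self, gco2eq: float) -> None:
        # Accumulate per-call emissions; abort once the budget is spent.
        self.used += gco2eq
        if self.used > self.budget:
            raise CarbonBudgetExceeded(self.used, self.budget)
```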
Telemetry
By default, each decorated run emits a normalized telemetry record containing:
- timestamp
- captured energy in kWh
- baseline and actual emissions in gCO2eq
- saved carbon
- selected schedule plan
- forecast provider
- LLM provider
- effective model
- per-session model usage summary
The built-in telemetry sinks are:
- JsonlTelemetrySink
- LoggerTelemetrySink
- CompositeTelemetrySink
- NoOpTelemetrySink
CLI
The package ships a telemetry report CLI:
```bash
ecotracker-report
python -m llm_eco_tracker.report
```
You can also point it at custom telemetry inputs:
```bash
ecotracker-report path/to/eco_telemetry.jsonl
python -m llm_eco_tracker.report app.log --format logger
```
Demo Scripts
These scripts are designed to be screenshot-friendly and work without paid API traffic.
- langchain_agentic_demo.py Runs a mocked LangChain workflow that makes multiple OpenAI calls inside one decorated function.
- eco_fallback_demo.py Demonstrates dirty-grid model downgrading with a deterministic CSV forecast.
- circuit_breaker_demo.py Demonstrates the carbon budget / circuit breaker aborting a session.
Run them from the repository root:
```bash
python scripts/langchain_agentic_demo.py
python scripts/eco_fallback_demo.py
python scripts/circuit_breaker_demo.py
```
Evaluation Pipeline
The repository includes a full benchmark and analysis pipeline:
```bash
python scripts/run_benchmark.py
python scripts/analyze_benchmark_results.py
python scripts/run_openai_integration_benchmark.py
python scripts/run_overhead_benchmark.py
python scripts/generate_paper_figures.py
```
Generated outputs include:
- scenario_results.csv
- daily_summary.csv
- benchmark_summary.json
- benchmark_analysis.json
- benchmark_analysis.md
- paper_figures/
License
This project is licensed under the MIT License. See LICENSE.