# ML-EcoLyzer

Machine Learning Environmental Impact Analysis Framework supporting HuggingFace, scikit-learn, and PyTorch.
A framework for measuring the environmental impact of ML inference. Tracks CO2 emissions, energy consumption, and water usage across different hardware setups.
## Why?
Training gets all the attention, but inference runs 24/7 in production. We built this to answer: "How much does running this model actually cost the environment?"
## Install

Available on PyPI:

```bash
pip install ml-ecolyzer
```

With framework-specific dependencies:

```bash
pip install "ml-ecolyzer[huggingface]"  # transformers, diffusers
pip install "ml-ecolyzer[pytorch]"      # torchvision, torchaudio
pip install "ml-ecolyzer[all]"          # everything
```
## Quick Start

```python
from mlecolyzer import EcoLyzer

config = {
    "project": "my_analysis",
    "models": [{"name": "gpt2", "task": "text"}],
    "datasets": [{"name": "wikitext", "task": "text", "limit": 100}],
}

eco = EcoLyzer(config)
results = eco.run()

summary = results["final_report"]["analysis_summary"]
print(f"CO2: {summary['total_co2_emissions_kg']:.6f} kg")
print(f"Energy: {summary['total_energy_kwh']:.6f} kWh")
```
## What it measures

- **CO2 emissions**: based on power draw and regional carbon intensity
- **Energy usage**: via NVIDIA-SMI, psutil, or RAPL
- **Water footprint**: cooling overhead varies by hardware tier
- **ESS (Environmental Sustainability Score)**: parameters per gram of CO2, useful for comparing models

```
ESS = Effective Parameters (M) / CO2 (g)
```

Higher ESS means more efficient. INT8 models typically score ~74% higher than FP32.
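As a back-of-the-envelope illustration of how these quantities relate (the numbers below are hypothetical, not outputs of the library):

```python
# Illustrative arithmetic only; ML-EcoLyzer measures these values itself.
energy_kwh = 0.05                    # hypothetical energy for one inference run
carbon_intensity_g_per_kwh = 475.0   # hypothetical regional grid intensity
params_m = 350.0                     # model size in millions of parameters

co2_g = energy_kwh * carbon_intensity_g_per_kwh   # grams of CO2 emitted
ess = params_m / co2_g                            # ESS: parameters (M) per gram of CO2

print(f"CO2: {co2_g:.2f} g, ESS: {ess:.2f}")      # → CO2: 23.75 g, ESS: 14.74
```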
## Supported setups

- GPUs: A100, T4, RTX series, GTX series
- CPU-only works too
- Frameworks: HuggingFace, PyTorch, scikit-learn
## Config file

```yaml
project: "benchmark_run"

models:
  - name: "facebook/opt-350m"
    task: "text"
    quantization:
      enabled: true
      target_dtype: "int8"

datasets:
  - name: "wikitext"
    task: "text"
    limit: 500

hardware:
  device_profile: "auto"

output:
  output_dir: "./results"
  export_formats: ["json", "csv"]
```
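The YAML file uses the same schema as the dict passed to `EcoLyzer` in Quick Start. The run above, expressed as the equivalent Python dict (keys copied from the YAML, assuming both entry points share one schema):

```python
# Same configuration as the YAML config file, as a Python dict for the EcoLyzer API.
config = {
    "project": "benchmark_run",
    "models": [
        {
            "name": "facebook/opt-350m",
            "task": "text",
            "quantization": {"enabled": True, "target_dtype": "int8"},
        }
    ],
    "datasets": [{"name": "wikitext", "task": "text", "limit": 500}],
    "hardware": {"device_profile": "auto"},
    "output": {"output_dir": "./results", "export_formats": ["json", "csv"]},
}
```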
## CLI

```bash
# Single run
mlecolyzer analyze --model gpt2 --dataset wikitext --task text

# System info
mlecolyzer info
```
## Benchmarks

We ran 1,500+ inference configurations across:

- Hardware: GTX 1650, RTX 4090, Tesla T4, A100
- Models: GPT-2, OPT, Qwen, LLaMA, Phi, Whisper, ViT
- Precisions: FP32, FP16, INT8

Key findings:

- The A100 has poor ESS when underutilized (overkill for small batches)
- Consumer GPUs (RTX/T4) are often more efficient for single-batch inference
- Quantization helps substantially, especially INT8
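The ~74% INT8 figure from "What it measures" follows directly from the ESS formula: for a fixed parameter count, ESS scales inversely with CO2, so cutting emissions by a factor raises ESS by the same factor. A quick check with hypothetical numbers:

```python
# Hypothetical emissions for one model at two precisions (not measured values).
params_m = 350.0
co2_fp32_g = 30.0
co2_int8_g = co2_fp32_g / 1.74   # INT8 run emitting 1.74x less CO2

ess_fp32 = params_m / co2_fp32_g
ess_int8 = params_m / co2_int8_g
gain = ess_int8 / ess_fp32 - 1   # relative ESS improvement

print(f"{gain:.0%}")  # → 74%
```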
## Contributing

See CONTRIBUTING.md. PRs welcome.

```bash
# Dev setup
pip install -e ".[dev]"
pytest
```
## Citation

```bibtex
@inproceedings{mlecolyzer2025,
  title={ML-EcoLyzer: A Framework for Quantifying the Environmental Impact of Machine Learning Inference},
  author={Minoza, Jose Marie Antonio and Laylo, Rex Gregor and Villarin, Christian and Ibanez, Sebastian},
  booktitle={AAAI Workshop on AI for Environmental Science},
  year={2025}
}
```
## License

MIT