ML-EcoLyzer: Machine Learning Environmental Impact Analysis Framework
A machine learning environmental impact analysis framework supporting HuggingFace, scikit-learn, and PyTorch.
ML-EcoLyzer is a reproducible benchmarking and analysis framework for quantifying the environmental cost of machine learning inference. It supports modern transformers, vision models, and classical ML pipelines, adaptable to both edge and datacenter-scale deployments.
Environmental profiling across tasks, models, and hardware tiers.
🌍 Key Features
- Inference-Centric Analysis: Quantifies CO₂ emissions, energy use, and water impact from real-time inference
- Cross-Hardware Profiling: Supports A100, T4, RTX, GTX, and CPU-only setups
- Model-Agnostic Framework: Runs LLMs, ViTs, audio models, and traditional ML
- ESS Metric: Introduces the Environmental Sustainability Score for normalized emissions comparison
- Quantization Insights: Analyzes FP16 and INT8 savings for sustainable deployment
- Frequency-Aware Monitoring: Adjusts sampling dynamically for short and long-running workloads
- Lightweight and Extensible: Runs on mobile, edge, and low-resource devices
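The frequency-aware monitoring idea can be sketched as a simple policy: sample power draw often for short jobs (where a coarse interval would miss the whole run) and sparsely for long jobs (to bound overhead). This is an illustrative policy only, not ML-EcoLyzer's actual scheduler; the thresholds and rates are assumptions.

```python
def sampling_interval_s(expected_runtime_s: float,
                        max_hz: float = 10.0, min_hz: float = 0.2) -> float:
    """Pick a power-sampling interval from the expected workload duration.

    Short jobs get high-frequency sampling; long jobs are sampled sparsely.
    Illustrative values: 10 Hz below 10 s, 0.2 Hz above 1 hour,
    linear interpolation in between.
    """
    if expected_runtime_s <= 10:
        hz = max_hz
    elif expected_runtime_s >= 3600:
        hz = min_hz
    else:
        # Linearly interpolate the rate between the two regimes
        frac = (expected_runtime_s - 10) / (3600 - 10)
        hz = max_hz + frac * (min_hz - max_hz)
    return 1.0 / hz

print(sampling_interval_s(5))     # 0.1 s between samples (10 Hz)
print(sampling_interval_s(7200))  # 5.0 s between samples (0.2 Hz)
```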
📊 What It Measures
✅ CO₂ Emissions
- Based on PUE, regional carbon intensity, and power consumption
- Adaptive to cloud, desktop, or edge scenarios
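A back-of-the-envelope version of this calculation (the PUE and carbon-intensity figures below are illustrative, not ML-EcoLyzer's internals):

```python
def co2_grams(energy_kwh: float, pue: float,
              carbon_intensity_g_per_kwh: float) -> float:
    """Estimate CO2 emissions: IT energy scaled to facility level by PUE,
    then multiplied by the regional grid carbon intensity (gCO2/kWh)."""
    return energy_kwh * pue * carbon_intensity_g_per_kwh

# Example: 0.05 kWh of inference in a datacenter (PUE 1.2) on a 400 gCO2/kWh grid
print(co2_grams(0.05, 1.2, 400.0))  # ≈ 24 g CO2
```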
✅ Energy Usage
- Instantaneous power profiling via NVIDIA-SMI, psutil, or RAPL
- Sample-level granularity for each inference configuration
✅ Water Footprint
- Derived from power-to-water coefficients by tier (e.g., datacenter vs. mobile)
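A minimal sketch of the power-to-water conversion, assuming per-tier water-usage coefficients in L/kWh (the values below are placeholders; the coefficients ML-EcoLyzer ships with may differ):

```python
# Assumed water-usage coefficients (L/kWh) by deployment tier
WATER_L_PER_KWH = {"datacenter": 1.8, "desktop": 0.5, "mobile": 0.1}

def water_liters(energy_kwh: float, tier: str) -> float:
    """Convert measured energy to an estimated water footprint for a tier."""
    return energy_kwh * WATER_L_PER_KWH[tier]

print(water_liters(0.06, "datacenter"))  # ≈ 0.108 L
```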
✅ Environmental Sustainability Score (ESS) Metric
$$\text{ESS} = \frac{\text{Effective Parameters (M)}}{\text{CO₂ (g)}}$$
- A normalized environmental efficiency metric for sustainable ML comparisons
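The ESS formula above translates directly to code; the GPT-2 figures in the example are hypothetical:

```python
def ess(effective_params_millions: float, co2_grams: float) -> float:
    """Environmental Sustainability Score: effective parameters (M) per gram
    of CO2. Higher is better (more model capacity per unit of emissions)."""
    if co2_grams <= 0:
        raise ValueError("CO2 emissions must be positive")
    return effective_params_millions / co2_grams

# Hypothetical run: a ~124M-parameter model emitting 50 g CO2
print(ess(124.0, 50.0))  # 2.48
```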
🔧 Installation
```bash
pip install ml-ecolyzer
```
🚀 Quick Example
```python
from mlecolyzer import EcoLyzer

config = {
    "project": "sustainability_demo",
    "models": [{"name": "gpt2", "task": "text"}],
    "datasets": [{"name": "wikitext", "task": "text"}]
}

eco = EcoLyzer(config)
results = eco.run()
print(f"CO₂: {results['final_report']['analysis_summary']['total_co2_emissions_kg']:.6f} kg")
```
📚 Scientific Foundation
Built on established environmental assessment literature and engineering standards:
- IEEE 754 (numeric precision)
- ASHRAE TC 9.9 (thermal/infra cooling)
- JEDEC JESD51 (thermal/power envelopes)
- Strubell et al. (2019), Patterson et al. (2021), Henderson et al. (2020), Lacoste et al. (2019)
🔬 Benchmark Coverage
- 1,500+ inference runs
- 4 hardware tiers: GTX 1650, RTX 4090, Tesla T4, A100
- Tasks: text, audio, vision, classification, regression
- Model families: GPT-2, OPT, Qwen, LLaMA, Phi, Whisper, ViT, etc.
- Precisions: FP32, FP16, INT8
🛠️ Configuration Template
```yaml
project: "ml_sustainability_benchmark"

models:
  - name: "facebook/opt-2.7b"
    task: "text"
    quantization:
      enabled: true
      method: "dynamic"
      target_dtype: "int8"

datasets:
  - name: "wikitext"
    task: "text"
    limit: 1000

monitoring:
  frequency_hz: 5
  enable_quantization_analysis: true

hardware:
  device_profile: "auto"

output:
  export_formats: ["json", "csv"]
  output_dir: "./results"
```
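Assuming `EcoLyzer` accepts the same dictionary shape shown in the Quick Example, a YAML template like the one above can be parsed with PyYAML and passed straight in. The snippet below inlines a minimal subset of the template for illustration:

```python
import yaml  # pip install pyyaml

# Minimal subset of the configuration template, inlined for illustration
CONFIG_YAML = """
project: "ml_sustainability_benchmark"
models:
  - name: "facebook/opt-2.7b"
    task: "text"
datasets:
  - name: "wikitext"
    task: "text"
    limit: 1000
"""

config = yaml.safe_load(CONFIG_YAML)
print(config["models"][0]["name"])  # facebook/opt-2.7b

# Then, as in the Quick Example (assumed entry point):
# from mlecolyzer import EcoLyzer
# results = EcoLyzer(config).run()
```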
🧪 Research Insights
Quantization Efficiency
INT8 models show up to 74% higher ESS than FP32 equivalents.
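Concretely, since ESS divides effective parameters by CO₂, a 74% ESS gain at fixed parameter count corresponds to INT8 cutting emissions to roughly 1/1.74 of the FP32 figure. The run below uses hypothetical numbers to show the arithmetic:

```python
def ess(params_m: float, co2_g: float) -> float:
    # ESS = effective parameters (M) per gram of CO2
    return params_m / co2_g

# Hypothetical run: same 124M-parameter model; INT8 emits 1/1.74 of FP32's CO2
fp32_ess = ess(124.0, 50.0)
int8_ess = ess(124.0, 50.0 / 1.74)
print(round(int8_ess / fp32_ess - 1.0, 2))  # 0.74, i.e. 74% higher ESS
```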
Hardware Utilization
The A100 yields the lowest ESS when underutilized; RTX and T4 GPUs deliver better emissions-per-parameter for single-batch workloads.
Task-Wise Trends
Classical models such as SVC or logistic regression incur a high per-parameter emissions cost (ECEP) because of their small parameter counts, despite low absolute energy use.
📜 Citation
```bibtex
@inproceedings{mlecolyzer2025,
  title={ML-EcoLyzer: Comprehensive Environmental Impact Analysis for Machine Learning Systems},
  author={Minoza, Jose Marie Antonio and Laylo, Rex Gregor and Villarin, Christian and Ibanez, Sebastian},
  booktitle={Proceedings of the Asian Conference on Machine Learning (ACML)},
  year={2025}
}
```
ML-EcoLyzer — Advancing sustainable inference in resource-constrained and production-scale deployments. 🌱