
Machine Learning Environmental Impact Analysis Framework supporting HuggingFace, scikit-learn, and PyTorch


ML-EcoLyzer

Python 3.8+ · License: MIT · PyPI · arXiv

A framework for measuring the environmental impact of ML inference. Tracks CO2 emissions, energy consumption, and water usage across different hardware setups.

[Figure: ML-EcoLyzer overview]

Why?

Training gets all the attention, but inference runs 24/7 in production. We built this to answer: "How much does running this model actually cost the environment?"

Install

Available on PyPI:

pip install ml-ecolyzer

With framework-specific dependencies:

pip install ml-ecolyzer[huggingface]  # transformers, diffusers
pip install ml-ecolyzer[pytorch]      # torchvision, torchaudio
pip install ml-ecolyzer[all]          # everything

Quick Start

from mlecolyzer import EcoLyzer

config = {
    "project": "my_analysis",
    "models": [{"name": "gpt2", "task": "text"}],
    "datasets": [{"name": "wikitext", "task": "text", "limit": 100}]
}

eco = EcoLyzer(config)
results = eco.run()

print(f"CO2: {results['final_report']['analysis_summary']['total_co2_emissions_kg']:.6f} kg")
print(f"Energy: {results['final_report']['analysis_summary']['total_energy_kwh']:.6f} kWh")
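The returned `results` object is a nested dict (the key paths in the prints above show its shape). A minimal sketch of persisting a run for later comparison, using a hypothetical stub dict in place of a real `eco.run()` call:

```python
import json

# Hypothetical stub standing in for eco.run() output; the numbers
# are illustrative, not real measurements.
results = {
    "final_report": {
        "analysis_summary": {
            "total_co2_emissions_kg": 0.000042,
            "total_energy_kwh": 0.000096,
        }
    }
}

# Persist the full report so separate runs can be diffed later
with open("run_results.json", "w") as f:
    json.dump(results, f, indent=2)
```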

What it measures

  • CO2 emissions - Based on power draw and regional carbon intensity
  • Energy usage - Via NVIDIA-SMI, psutil, or RAPL
  • Water footprint - Cooling overhead varies by hardware tier
  • ESS (Environmental Sustainability Score) - Effective parameters per gram of CO2, useful for comparing models:

ESS = Effective Parameters (M) / CO2 (g)

Higher ESS = more efficient. INT8 models typically score ~74% higher than FP32.
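The arithmetic behind the CO2 and ESS metrics can be sketched in a few lines. The energy, grid-intensity, and parameter-count figures below are illustrative assumptions, not measurements produced by the framework:

```python
# Illustrative inputs (assumptions, not ML-EcoLyzer measurements)
energy_kwh = 0.012          # measured inference energy
grid_g_per_kwh = 475.0      # regional carbon intensity (gCO2 per kWh)
effective_params_m = 124.0  # e.g. GPT-2's ~124M parameters

# CO2 = energy x regional carbon intensity
co2_g = energy_kwh * grid_g_per_kwh

# ESS = effective parameters (M) per gram of CO2; higher is better
ess = effective_params_m / co2_g

print(f"CO2: {co2_g:.2f} g, ESS: {ess:.1f} MParams/g")
```

The same inference on a cleaner grid (lower gCO2/kWh) or with a quantized model (lower energy per token) raises ESS, which is why INT8 models score higher than FP32.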

Supported setups

  • GPUs: A100, T4, RTX series, GTX series
  • CPU-only works too
  • Frameworks: HuggingFace, PyTorch, scikit-learn

Config file

project: "benchmark_run"

models:
  - name: "facebook/opt-350m"
    task: "text"
    quantization:
      enabled: true
      target_dtype: "int8"

datasets:
  - name: "wikitext"
    task: "text"
    limit: 500

hardware:
  device_profile: "auto"

output:
  output_dir: "./results"
  export_formats: ["json", "csv"]
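The YAML above maps to a plain dict with the same structure passed to `EcoLyzer` in Quick Start. A sketch of the equivalent Python literal (only fields shown in the YAML are used):

```python
# Equivalent of the YAML config above, as a Python dict
config = {
    "project": "benchmark_run",
    "models": [
        {
            "name": "facebook/opt-350m",
            "task": "text",
            # INT8 quantization, as in the YAML
            "quantization": {"enabled": True, "target_dtype": "int8"},
        }
    ],
    "datasets": [{"name": "wikitext", "task": "text", "limit": 500}],
    "hardware": {"device_profile": "auto"},
    "output": {"output_dir": "./results", "export_formats": ["json", "csv"]},
}
```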

CLI

# Single run
mlecolyzer analyze --model gpt2 --dataset wikitext --task text

# System info
mlecolyzer info

Benchmarks

Ran 1,500+ inference configs across:

  • Hardware: GTX 1650, RTX 4090, Tesla T4, A100
  • Models: GPT-2, OPT, Qwen, LLaMA, Phi, Whisper, ViT
  • Precisions: FP32, FP16, INT8

Key findings:

  • A100 has poor ESS when underutilized (overkill for small batches)
  • Consumer GPUs (RTX/T4) often more efficient for single-batch inference
  • Quantization helps a lot, especially INT8

Contributing

See CONTRIBUTING.md. PRs welcome.

# Dev setup
pip install -e ".[dev]"
pytest

Citation

@misc{mlecolyzer2026,
  title={ML-EcoLyzer: A Framework for Quantifying the Environmental Impact of Machine Learning Inference},
  author={Minoza, Jose Marie Antonio and Laylo, Rex Gregor and Villarin, Christian and Ibanez, Sebastian},
  year={2026},
  note={AAAI Workshop on AI for Environmental Science},
  eprint={2511.06694},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  doi={10.48550/arXiv.2511.06694}
}

License

MIT

Download files

Download the file for your platform.

Source Distribution

ml_ecolyzer-1.1.2.tar.gz (279.4 kB)

Built Distribution

ml_ecolyzer-1.1.2-py3-none-any.whl (121.6 kB)

File details

Details for the file ml_ecolyzer-1.1.2.tar.gz.

File metadata

  • Download URL: ml_ecolyzer-1.1.2.tar.gz
  • Size: 279.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for ml_ecolyzer-1.1.2.tar.gz

  • SHA256: e575e59e0eef2f548c1054a2f690e6a57cfc2ce5ba60ac1c5f9761ad206d6fc1
  • MD5: c5c6a86d81e7e5309b081aed28c7b593
  • BLAKE2b-256: 15a69341b7ed7dc27020467e35e8f941779a969df8d147ac7e8ed28dc7a4551a

File details

Details for the file ml_ecolyzer-1.1.2-py3-none-any.whl.

File metadata

  • Download URL: ml_ecolyzer-1.1.2-py3-none-any.whl
  • Size: 121.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.9

File hashes

Hashes for ml_ecolyzer-1.1.2-py3-none-any.whl

  • SHA256: 9c2426a5f1a878e6fa693c3a1441ff4ec43868cc001f7be592f48efb4bea68a6
  • MD5: 1fa89c4a0c33558a5c86f2b21918149b
  • BLAKE2b-256: 9e86f20cbb0d7e9fb99d57f282882b77a27c676e47acd334e54ce1fdf8d856ca
