# EQODEC
EQODEC is a lightweight Python package that provides carbon-aware loss functions and energy-efficiency evaluation metrics for neural compression and representation learning systems.
It enables researchers and engineers to explicitly account for carbon cost during model training and evaluation, rather than treating emissions as a post-hoc reporting metric.
## Motivation
The rapid growth of neural compression, generative models, and large-scale inference has significantly increased the energy and carbon footprint of data storage and transmission systems. While existing codecs optimize for rate-distortion trade-offs, they do not account for environmental impact during training or inference.
EQODEC addresses this gap by:
- Integrating carbon awareness directly into the training objective
- Providing a standardized Energy-Efficiency Score (EES) for fair evaluation
- Remaining model-agnostic and framework-lightweight
## Key Features

**Carbon-Aware Loss Function**
- Drop-in replacement for standard rate-distortion losses
- Supports reconstruction, bitrate, and carbon regularization terms

**Energy-Efficiency Score (EES)**
- Quantifies storage savings per unit carbon cost
- Suitable for neural codecs, traditional codecs, and hybrid pipelines

**Framework-Agnostic Design**
- Works with any neural compression architecture
- Minimal assumptions about latent representations

**CodeCarbon Integration**
- Uses real-world carbon-intensity estimates
- Graceful fallback when regional data is unavailable
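The page does not document how the three regularization terms are combined, but a carbon-aware objective of this shape can be sketched as a weighted sum of distortion, rate, and estimated emissions. The function below is a minimal illustration under that assumption, not EQODEC's actual implementation; all argument names are hypothetical:

```python
def carbon_aware_loss(recon_err, bits_per_pixel, energy_kwh,
                      carbon_intensity_kg_per_kwh,
                      lambda_recon=5.0, lambda_rate=0.5, lambda_carbon=0.005):
    """Weighted sum of distortion, rate, and estimated carbon cost.

    A sketch only: the real CarbonAwareLoss may weight or normalize
    these terms differently.
    """
    # Convert measured energy into emissions via the grid's carbon intensity.
    carbon_kg = energy_kwh * carbon_intensity_kg_per_kwh
    loss = (lambda_recon * recon_err
            + lambda_rate * bits_per_pixel
            + lambda_carbon * carbon_kg)
    components = {"recon": recon_err, "rate": bits_per_pixel, "carbon": carbon_kg}
    return loss, components

loss, parts = carbon_aware_loss(0.02, 0.8, energy_kwh=0.5,
                                carbon_intensity_kg_per_kwh=0.4)
```

Returning the components alongside the scalar loss mirrors the `loss, metrics` pair in the Quick Start below and keeps each term interpretable.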
## Installation

```bash
pip install eqodec
```
Requirements:
- Python >= 3.8
- PyTorch
- NumPy
- CodeCarbon
## Quick Start

### 1. Carbon-Aware Loss Function
```python
import torch
from eqodec import CarbonAwareLoss, get_local_carbon_intensity

# Estimate the local grid's carbon intensity (kg CO2 per kWh)
carbon_intensity = get_local_carbon_intensity()

criterion = CarbonAwareLoss(
    lambda_recon=5.0,     # weight on the reconstruction (distortion) term
    lambda_rate=0.5,      # weight on the bitrate term
    lambda_carbon=0.005,  # weight on the carbon penalty
    carbon_intensity=carbon_intensity,
)

# x_hat: reconstruction, x: original input, latent: code from your model
loss, metrics = criterion(x_hat, x, latent)
```
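The "graceful fallback" behavior advertised for `get_local_carbon_intensity` is not specified on this page. One plausible pattern, sketched below with a hypothetical lookup table and a world-average default (roughly 0.475 kg CO2/kWh per commonly cited IEA estimates), is to try the regional figure and fall back when no data exists; the real function relies on CodeCarbon's regional data instead:

```python
def local_carbon_intensity(region_lookup, region, world_avg=0.475):
    """Return kg CO2 per kWh for `region`, falling back to a world average.

    Sketch of the 'graceful fallback' behavior only; not EQODEC's
    actual get_local_carbon_intensity.
    """
    try:
        return region_lookup[region]
    except KeyError:
        # No regional data available: fall back to the global average.
        return world_avg

# France's low-carbon grid vs. an unknown region:
fr = local_carbon_intensity({"FR": 0.056}, "FR")
unknown = local_carbon_intensity({"FR": 0.056}, "ZZ")
```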
### 2. Energy-Efficiency Score (EES)
```python
from eqodec import energy_efficiency_score

ees = energy_efficiency_score(
    baseline_bytes=2.3e9,    # size before compression
    compressed_bytes=1.9e9,  # size after compression
    kg_co2=0.82,             # emissions measured during compression
)
```
Higher values indicate more sustainable compression.
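The exact EES formula is not stated on this page, but "storage savings per unit carbon cost" suggests a definition of roughly the following shape. This is an assumption for illustration, not the library's documented formula:

```python
def ees_sketch(baseline_bytes, compressed_bytes, kg_co2):
    """Bytes saved per kilogram of CO2 emitted (hypothetical formula)."""
    return (baseline_bytes - compressed_bytes) / kg_co2

# With the Quick Start numbers: 0.4 GB saved at 0.82 kg CO2,
# roughly 4.88e8 bytes saved per kg of CO2.
ees = ees_sketch(baseline_bytes=2.3e9, compressed_bytes=1.9e9, kg_co2=0.82)
```

Under this reading, a codec that saves the same number of bytes while emitting less CO2 scores higher, which matches the sentence above.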
## Use Cases
- Neural image, video, and audio compression
- Learned entropy models and autoencoders
- Carbon-aware representation learning
- Sustainability benchmarking of codecs
- Research on green AI and efficient ML systems
## Design Philosophy

EQODEC is intentionally:
- **Minimal**: no datasets, no training loops
- **Composable**: integrates into existing pipelines
- **Transparent**: interpretable loss components
- **Research-first**: aligned with reproducible evaluation
## File details

### eqodec-0.4.0.tar.gz (source distribution)

- Download URL: eqodec-0.4.0.tar.gz
- Upload date:
- Size: 3.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | `310b74a3f4fba4527ad2680773b6ed770734a72ee99b548414571b1ac845018e` |
| MD5 | `e074f7392a2f2da40e0af5e3d634621f` |
| BLAKE2b-256 | `afa365772a46d06131e52d9b4eee817dadb4e2f0a418ef71e22a19edd2edbfc2` |
### eqodec-0.4.0-py3-none-any.whl (built distribution)

- Download URL: eqodec-0.4.0-py3-none-any.whl
- Upload date:
- Size: 4.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | `001e0177897a1dc62b987e48eac90a103a6f039cbb8d25278a537587ec0d627d` |
| MD5 | `230442b634d566bd681abf26776e666b` |
| BLAKE2b-256 | `620d23db1091d2dab2bc9ac25d91dd8fa5621b57aeb1515c5a93eab277b312ac` |