
EQODEC

EQODEC is a lightweight Python package that provides carbon-aware loss functions and energy-efficiency evaluation metrics for neural compression and representation learning systems.

It enables researchers and engineers to explicitly account for carbon cost during model training and evaluation, rather than treating emissions as a post-hoc reporting metric.


Motivation

The rapid growth of neural compression, generative models, and large-scale inference has significantly increased the energy and carbon footprint of data storage and transmission systems. While existing codecs optimize for rate-distortion trade-offs, they do not account for environmental impact during training or inference.

EQODEC addresses this gap by:

  • Integrating carbon awareness directly into the training objective
  • Providing a standardized Energy-Efficiency Score (EES) for fair evaluation
  • Remaining model-agnostic and framework-lightweight

Key Features

  • Carbon-Aware Loss Function

    • Drop-in replacement for standard rate–distortion losses
    • Supports reconstruction, bitrate, and carbon regularization terms
  • Energy-Efficiency Score (EES)

    • Quantifies storage savings per unit carbon cost
    • Suitable for neural codecs, traditional codecs, and hybrid pipelines
  • Framework-Agnostic Design

    • Works with any neural compression architecture
    • Minimal assumptions about latent representations
  • CodeCarbon Integration

    • Uses real-world carbon intensity estimates
    • Graceful fallback when regional data is unavailable
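The graceful-fallback behaviour can be sketched as a small helper. This is a hypothetical illustration, not eqodec's actual implementation: the helper name, the `region_lookup` callable, and the 475 gCO2eq/kWh default (an assumed world-average figure) are all stand-ins.

```python
# Hypothetical sketch of a graceful carbon-intensity fallback.
GLOBAL_AVG_INTENSITY = 475.0  # gCO2eq/kWh, assumed world-average default

def get_carbon_intensity(region_lookup=None):
    """Return a regional carbon intensity if a lookup succeeds,
    otherwise fall back to a global average."""
    if region_lookup is None:
        return GLOBAL_AVG_INTENSITY
    try:
        value = region_lookup()
        # Treat missing or non-positive readings as unavailable data
        return value if value and value > 0 else GLOBAL_AVG_INTENSITY
    except Exception:
        return GLOBAL_AVG_INTENSITY
```

The key design point is that training never blocks on network or regional-data availability: any failure path degrades to a sensible constant rather than an exception.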

Installation

pip install eqodec

Requirements

  • Python >= 3.8
  • PyTorch
  • NumPy
  • CodeCarbon

Quick Start

1. Carbon-Aware Loss Function

import torch
from eqodec import CarbonAwareLoss, get_local_carbon_intensity

# Estimate the grid carbon intensity for the current region
carbon_intensity = get_local_carbon_intensity()

criterion = CarbonAwareLoss(
    lambda_recon=5.0,      # weight on the reconstruction term
    lambda_rate=0.5,       # weight on the bitrate term
    lambda_carbon=0.005,   # weight on the carbon term
    carbon_intensity=carbon_intensity
)

# x: input batch, x_hat: model reconstruction, latent: latent code
loss, metrics = criterion(x_hat, x, latent)
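Conceptually, the loss is a weighted sum of distortion, rate, and carbon terms. A minimal pure-Python sketch of such an objective follows; the function, its argument names, and the gram-to-kilogram conversion are assumptions for illustration, not eqodec's internals.

```python
def carbon_aware_loss(recon_err, bits_per_pixel, energy_kwh,
                      lambda_recon=5.0, lambda_rate=0.5, lambda_carbon=0.005,
                      carbon_intensity=475.0):
    """Hypothetical carbon-aware objective: weighted sum of a
    reconstruction term, a rate term, and a carbon term."""
    # Convert energy draw to emissions: kWh * gCO2eq/kWh -> kg CO2eq
    carbon_kg = energy_kwh * carbon_intensity / 1000.0
    loss = (lambda_recon * recon_err
            + lambda_rate * bits_per_pixel
            + lambda_carbon * carbon_kg)
    # Return interpretable per-term metrics alongside the scalar loss
    return loss, {"recon": recon_err, "rate": bits_per_pixel,
                  "carbon_kg": carbon_kg}
```

Because each term is exposed separately, the relative weights can be tuned by inspecting which component dominates the total.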

2. Energy-Efficiency Score (EES)

from eqodec import energy_efficiency_score

ees = energy_efficiency_score(
    baseline_bytes=2.3e9,
    compressed_bytes=1.9e9,
    kg_co2=0.82
)

Higher values indicate more sustainable compression.
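Under one natural reading of "storage savings per unit carbon cost" — bytes saved per kilogram of CO2 emitted — the score could be sketched as follows. This is an assumed definition for illustration, not eqodec's documented formula.

```python
def ees_sketch(baseline_bytes, compressed_bytes, kg_co2):
    """Hypothetical EES: bytes of storage saved per kg of CO2 emitted."""
    return (baseline_bytes - compressed_bytes) / kg_co2

# With the Quick Start numbers: (2.3e9 - 1.9e9) / 0.82 ≈ 4.88e8 bytes/kg
score = ees_sketch(2.3e9, 1.9e9, 0.82)
```

Under this reading, a codec that saves more bytes for the same emissions, or the same bytes for fewer emissions, scores higher, matching the note above.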


Use Cases

  • Neural image, video, and audio compression
  • Learned entropy models and autoencoders
  • Carbon-aware representation learning
  • Sustainability benchmarking of codecs
  • Research on green AI and efficient ML systems

Design Philosophy

EQODEC is intentionally:

  • Minimal - no datasets, no training loops
  • Composable - integrates into existing pipelines
  • Transparent - interpretable loss components
  • Research-first - aligned with reproducible evaluation


