Project description

EQODEC

EQODEC is a lightweight Python package that provides carbon-aware loss functions and energy-efficiency evaluation metrics for neural compression and representation learning systems.

It enables researchers and engineers to explicitly account for carbon cost during model training and evaluation, rather than treating emissions as a post-hoc reporting metric.

Optimize quality, bitrate, and carbon cost — jointly.


Motivation

The rapid growth of neural compression, generative models, and large-scale inference has significantly increased the energy and carbon footprint of data storage and transmission systems. While existing codecs optimize for rate–distortion trade-offs, they do not account for environmental impact during training or inference.

EQODEC addresses this gap by:

  • Integrating carbon awareness directly into the training objective
  • Providing a standardized Energy-Efficiency Score (EES) for fair evaluation
  • Remaining model-agnostic and framework-lightweight

Key Features

  • Carbon-Aware Loss Function

    • Drop-in replacement for standard rate–distortion losses
    • Supports reconstruction, bitrate, and carbon regularization terms
  • Energy-Efficiency Score (EES)

    • Quantifies storage savings per unit carbon cost
    • Suitable for neural codecs, traditional codecs, and hybrid pipelines
  • Framework-Agnostic Design

    • Works with any neural compression architecture
    • Minimal assumptions about latent representations
  • CodeCarbon Integration

    • Uses real-world carbon intensity estimates
    • Graceful fallback when regional data is unavailable

Installation

pip install eqodec

Requirements

  • Python ≥ 3.8
  • PyTorch
  • NumPy
  • CodeCarbon

Quick Start

1. Carbon-Aware Loss Function

import torch
from eqodec import CarbonAwareLoss, get_local_carbon_intensity

# Estimate the local carbon intensity; falls back gracefully
# when regional data is unavailable.
carbon_intensity = get_local_carbon_intensity()

criterion = CarbonAwareLoss(
    lambda_recon=5.0,     # weight on the reconstruction term
    lambda_rate=0.5,      # weight on the bitrate term
    lambda_carbon=0.005,  # weight on the carbon term
    carbon_intensity=carbon_intensity
)

# x: input batch, x_hat: reconstruction, latent: encoder output
loss, metrics = criterion(x_hat, x, latent)

Loss formulation:

\[ \mathcal{L} = \lambda_r \mathcal{L}_{\text{recon}} + \lambda_b \mathcal{L}_{\text{rate}} + \lambda_c \mathcal{L}_{\text{carbon}} \]
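As a plain-Python illustration (not the package's implementation), the objective is simply a weighted sum of the three terms; the weights below match the lambda values passed to `CarbonAwareLoss` in the Quick Start, while the term values and the helper name are hypothetical:

```python
# Illustrative sketch only: EQODEC computes the three terms internally
# from (x_hat, x, latent); here they are passed in directly.
def carbon_aware_loss(recon, rate, carbon,
                      lambda_recon=5.0, lambda_rate=0.5, lambda_carbon=0.005):
    """Weighted sum of reconstruction, bitrate, and carbon terms."""
    return lambda_recon * recon + lambda_rate * rate + lambda_carbon * carbon

# Example: reconstruction error 0.02, rate term 1.5, carbon term 10.0
loss = carbon_aware_loss(0.02, 1.5, 10.0)
print(loss)  # ≈ 0.9 (0.1 + 0.75 + 0.05)
```

Raising `lambda_carbon` trades reconstruction quality and bitrate against emissions, which is the knob the package exposes for carbon-aware training.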

2. Energy-Efficiency Score (EES)

from eqodec import energy_efficiency_score

ees = energy_efficiency_score(
    baseline_bytes=2.3e9,    # size before compression (bytes)
    compressed_bytes=1.9e9,  # size after compression (bytes)
    kg_co2=0.82              # emissions attributed to the run (kg CO2)
)

Definition:

\[ \text{EES} = \frac{\text{GB saved}}{\text{kg CO}_2} \]

Higher values indicate more sustainable compression.
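The definition can be sketched in plain Python (the function name mirrors `energy_efficiency_score`, but this is an illustration, not the package's implementation):

```python
def energy_efficiency_score(baseline_bytes, compressed_bytes, kg_co2):
    """GB saved per kilogram of CO2 emitted (higher is better)."""
    gb_saved = (baseline_bytes - compressed_bytes) / 1e9
    return gb_saved / kg_co2

# Values from the Quick Start example: 0.4 GB saved at 0.82 kg CO2
ees = energy_efficiency_score(2.3e9, 1.9e9, 0.82)
print(round(ees, 3))  # 0.488
```

Because the score normalizes by emissions rather than by compute time, it lets neural and traditional codecs be compared on the same sustainability axis.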


Use Cases

  • Neural image, video, and audio compression
  • Learned entropy models and autoencoders
  • Carbon-aware representation learning
  • Sustainability benchmarking of codecs
  • Research on green AI and efficient ML systems

Design Philosophy

EQODEC is intentionally:

  • Minimal – no datasets, no training loops
  • Composable – integrates into existing pipelines
  • Transparent – interpretable loss components
  • Research-first – aligned with reproducible evaluation

Relation to the EQODEC Framework

This package is the modular, reusable extraction of the EQODEC research framework:

EQODEC: A Carbon-Aware Neural Compression Framework

The full experimental pipeline (ConvGRU architecture, FFmpeg evaluation, Vimeo-90K benchmarking) is maintained separately in the research repository.


Contact

For questions, issues, or research collaboration:

Ian Jure Macalisang
GitHub: https://github.com/ianjure
Email: ianjuremacalisang2.com


Download files

Download the file for your platform.

Source Distribution

eqodec-0.2.0.tar.gz (4.0 kB)

Built Distribution

eqodec-0.2.0-py3-none-any.whl (4.6 kB)

File details

Details for the file eqodec-0.2.0.tar.gz.

File metadata

  • Download URL: eqodec-0.2.0.tar.gz
  • Size: 4.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.7

File hashes

Hashes for eqodec-0.2.0.tar.gz

  • SHA256: e66ddda707651d960e4482ba6a1aaf757ad16e1280ea8a84bddb433eefe5ae71
  • MD5: 64c187a5d65012b91f72eadd09b16cd4
  • BLAKE2b-256: 87b9fdb9c487295507a8be66f0f48b8ca611231fa9bb65819a23002418ca5a5f


File details

Details for the file eqodec-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: eqodec-0.2.0-py3-none-any.whl
  • Size: 4.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.7

File hashes

Hashes for eqodec-0.2.0-py3-none-any.whl

  • SHA256: 9ad088e46b5afb2609452881ab1ff89bbba3470d1b5fb7d4d340ce4ca7677d57
  • MD5: b8dddb5fd8186ced95d382e0b7120dbb
  • BLAKE2b-256: b094df3d4dc670f703f134c72a60745da72c93ef1fd2ef0179593465d646c1f5

