Epistemic Weight Engine — Pre-update gating for noise-robust AI learning
What is EWE?
Current AI systems treat every training sample equally — regardless of whether the information is reliable, novel, or approval-biased.
EWE adds a gate before every parameter update that asks three questions:
| Layer | Signal | Question |
|---|---|---|
| I(x) | Gradient magnitude | Does this sample actually matter? |
| R(x) | Evidence vs approval | Is this label reliable or just popular? |
| P(x) | Loss deviation | Is this genuinely new information? |
Only samples that pass the gate trigger a parameter update. Everything else is skipped.
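In spirit, the gate computes a weighted score over the three signals and compares it against a sensitivity threshold. The sketch below illustrates that idea only: `gate_mask`, the hand-made signal values, and the exact combination rule are assumptions for illustration, not the `EWEGate` internals (the weight names mirror the defaults shown in the configuration section).

```python
import torch

def gate_mask(impact, reliability, novelty,
              alpha=0.45, beta=0.40, gamma=0.15, k=0.25):
    """Combine the three per-sample signals into a score and threshold it."""
    score = alpha * impact + beta * reliability + gamma * novelty
    return score >= k

# Hand-made signal vectors for three samples (high, low, medium quality)
impact = torch.tensor([0.9, 0.1, 0.5])
reliability = torch.tensor([0.8, 0.2, 0.6])
novelty = torch.tensor([0.7, 0.1, 0.4])
mask = gate_mask(impact, reliability, novelty)
print(mask)  # only samples marked True would contribute to the update
```

Here the low-quality middle sample scores below `k=0.25` and is skipped, while the other two pass.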
Results on CIFAR-10N with 40% real human label noise:
| Method | Accuracy |
|---|---|
| Standard Training | 72.37% |
| EWE (ours) | 79.36% (+6.99%) |
| EWE + GCE (ours) | 84.33% (+11.96%) |
Install
```bash
pip install ewe-gate
```
Quick Start
Option 1 — Just the gate (most flexible)
```python
import torch
import torch.nn as nn
from ewe import EWEGate

model = YourModel()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss(reduction='none')  # per-sample losses are required
gate = EWEGate()

for epoch in range(50):
    for x, y in dataloader:
        optimizer.zero_grad()
        outputs = model(x)
        losses = criterion(outputs, y)
        # EWE filters which samples update parameters
        filtered_loss = gate.filter_losses(losses, outputs.detach())
        if filtered_loss is not None:  # None means every sample was gated out
            filtered_loss.backward()
            optimizer.step()
    print(f"Epoch {epoch} | Accept rate: {gate.acceptance_rate:.1%}")
```
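Conceptually, `filter_losses` behaves like a masked mean over the accepted samples, returning `None` when nothing passes so the step can be skipped. The function below is a hypothetical re-implementation of that contract for illustration; the real method computes the mask internally and may differ in detail.

```python
import torch

def filter_losses_sketch(losses, mask):
    """Masked mean over accepted samples; None if the gate rejected everything."""
    if not mask.any():
        return None  # nothing passed the gate, so skip this optimizer step
    return losses[mask].mean()

losses = torch.tensor([1.2, 0.3, 2.0])
mask = torch.tensor([True, False, True])
print(filter_losses_sketch(losses, mask))  # tensor(1.6000)
```

Returning `None` rather than a zero loss is what makes the `if filtered_loss is not None:` guard in the training loop necessary.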
Option 2 — EWETrainer (simplest)
```python
import torch.nn as nn
from ewe import EWETrainer

trainer = EWETrainer(
    model=model,
    optimizer=optimizer,
    criterion=nn.CrossEntropyLoss(reduction='none'),
)

for epoch in range(50):
    loss, acc, rate = trainer.train_epoch(train_loader)
    val_acc = trainer.evaluate(val_loader)
    print(f"Epoch {epoch} | Loss: {loss:.3f} | Acc: {acc:.1f}% | Accept: {rate:.1%}")
```
Option 3 — EWE + GCE combined (best results)
```python
from ewe import EWEGate, GCELoss

gate = EWEGate()
criterion = GCELoss(q=0.7)  # noise-robust loss

for x, y in dataloader:
    optimizer.zero_grad()
    outputs = model(x)
    losses = criterion(outputs, y, reduction='none')
    filtered_loss = gate.filter_losses(losses, outputs.detach())
    if filtered_loss is not None:
        filtered_loss.backward()
        optimizer.step()
```
Configuration
```python
gate = EWEGate(
    alpha=0.45,      # weight for the Impact module I(x)
    beta=0.40,       # weight for Reality Alignment R(x) (most important)
    gamma=0.15,      # weight for Paradigm Shift P(x)
    k=0.25,          # gate sensitivity (higher = stricter)
    lam=0.5,         # approval penalty strength
    ema_decay=0.99,  # EMA decay for the loss baseline
)
```
Key parameter: k
- `k=0.10` → ~70% acceptance (loose gate)
- `k=0.25` → ~60% acceptance (default, balanced)
- `k=0.50` → ~50% acceptance (strict gate)
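The trade-off is easy to see with a fixed set of gate scores: raising the threshold admits fewer samples. The scores below are synthetic values chosen for illustration, not real EWE output.

```python
# Synthetic per-sample gate scores for ten samples
scores = [0.02, 0.05, 0.08, 0.15, 0.30, 0.52, 0.55, 0.60, 0.75, 0.90]

for k in (0.10, 0.25, 0.50):
    rate = sum(s >= k for s in scores) / len(scores)
    print(f"k={k:.2f} -> acceptance {rate:.0%}")
# k=0.10 -> acceptance 70%
# k=0.25 -> acceptance 60%
# k=0.50 -> acceptance 50%
```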
Monitor the gate
```python
gate = EWEGate()

# After training
print(f"Acceptance rate: {gate.acceptance_rate:.1%}")
print(f"Suppression rate: {gate.suppression_rate:.1%}")

# Reset statistics between experiments
gate.reset_stats()
```
Available loss functions
```python
from ewe import GCELoss, LabelSmoothingLoss

# Generalised Cross-Entropy (Zhang & Sabuncu, 2018)
criterion = GCELoss(q=0.7)

# Label Smoothing (Szegedy et al., 2016)
criterion = LabelSmoothingLoss(smoothing=0.1)
```
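For reference, the Generalised Cross-Entropy loss is `L_q = (1 - p_y**q) / q`, where `p_y` is the softmax probability of the labelled class; it recovers cross-entropy as `q -> 0` and mean absolute error at `q = 1`. The function below is a minimal hypothetical re-implementation of that formula, not the library's `GCELoss` class.

```python
import torch
import torch.nn.functional as F

def gce_loss(logits, targets, q=0.7):
    """Per-sample GCE: (1 - p_y**q) / q, with p_y the probability of the label."""
    p_y = F.softmax(logits, dim=1).gather(1, targets.unsqueeze(1)).squeeze(1)
    return (1.0 - p_y.pow(q)) / q

logits = torch.tensor([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]])
targets = torch.tensor([0, 1])
per_sample = gce_loss(logits, targets)  # shape (2,), one loss per sample
```

Because the loss saturates for low-confidence predictions, mislabelled samples contribute bounded gradients, which is what makes it noise-robust.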
Citation
If you use EWE in your research, please cite:
```bibtex
@article{purohit2026ewe,
  title={Epistemic Weight Engine (EWE): A Framework for Signal-Reliability-Weighted
         Learning in Artificial Neural Systems, with Multi-Dataset Experimental Evaluation},
  author={Purohit, Maheep},
  journal={ACM Transactions on Intelligent Systems and Technology},
  year={2026},
  note={Manuscript ID: TIST-2026-03-0289},
  doi={10.5281/zenodo.18940011}
}
```
Paper and Code
- Paper (preprint): https://doi.org/10.5281/zenodo.18940011
- Experiments: https://github.com/maheeppurohit/epistemic-weight-engine
- Contact: purohitmaheep@gmail.com
Author
Maheep Purohit, Independent Researcher, Bikaner, Rajasthan, India. Patent applicant: Adaptive Intelligent Pipeline Integrity System (filed 2025).
This research was conducted entirely independently without institutional affiliation, laboratory access, external funding, or academic supervision.
File details
Details for the file ewe_gate-0.1.0.tar.gz.
File metadata
- Download URL: ewe_gate-0.1.0.tar.gz
- Upload date:
- Size: 4.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `a15ebdb955f925fd05da886c50074ef44deb7d6cfceed6ced9b3a68750adfa78` |
| MD5 | `d665eb374818831ef023dbfc8b33a97c` |
| BLAKE2b-256 | `08c3b89f53a86017663d75dec7dd05a56c0bfa92180b8406dce9b15c47946421` |
File details
Details for the file ewe_gate-0.1.0-py3-none-any.whl.
File metadata
- Download URL: ewe_gate-0.1.0-py3-none-any.whl
- Upload date:
- Size: 3.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `b5d50514de90ebfb0594f775396c1aec711ba946f44fe5beb1d7a8a1d80005f7` |
| MD5 | `a9fecf92f33767fb902677611539d032` |
| BLAKE2b-256 | `c93be0e4c8353ecbada73e79b6e00ff0308674f0d676fe6503b99b24416e9821` |