
PsiLogic: An Active Cancellation Optimizer for Deep Neural Networks


ΨLogic

Active Cancellation Optimizer for Deep Learning


```
dΨ/dt = -iĤ·Ψ  −  γ·P·tanh(S_info)·Ψ
         └──────┘   └─────────────────┘
          Gradient   Active Cancellation
```

ΨLogic is a PyTorch optimizer with a self-regulating damping term that fires when the model is confused and vanishes when it converges. Tested against Adam, AdamW, and SGD across images, text, and audio — on real hardware.


Benchmark Results

All experiments use identical weight initialization, identical CosineAnnealingLR scheduler, and gradient clipping. Full raw logs: logs.md

🖼 Images — CIFAR-10 · ResNet-18

30 epochs (4 optimizers):

| Epoch | Adam | AdamW | SGD | ΨLogic |
|------:|------|-------|-----|--------|
| 1 | 54.17% | 51.66% | 46.07% | 56.05% ← leads |
| 5 | 79.84% | 83.00% | 68.66% | 82.22% |
| 10 | 83.64% | 87.22% | 76.29% | 86.85% |
| 15 | 88.15% | 89.69% | 79.47% | 89.58% |
| 20 | 91.68% | 91.54% | 82.09% | 91.67% |
| 25 | 92.75% | 92.51% | 83.54% | 92.38% |
| 30 | 93.19% | 92.62% | 83.64% | 92.60% |

100 epochs (Adam vs ΨLogic, 2 independent runs):

| Epoch | Adam (local) | ΨLogic (local) | Adam (Colab) | ΨLogic (Colab) |
|------:|--------------|----------------|--------------|----------------|
| 1 | 52.98% | 60.68% (+7.70%) | 56.46% | 54.18% |
| 10 | 82.96% | 87.70% (+4.74%) | 83.54% | 87.36% (+3.82%) |
| 30 | 89.70% | 91.68% (+1.98%) | 88.78% | 91.00% (+2.22%) |
| 50 | 90.90% | 92.21% (+1.31%) | 91.46% | 92.11% (+0.65%) |
| 70 | 92.50% | 93.16% (+0.66%) | 92.35% | 92.82% (+0.47%) |
| 90 | 93.39% | 93.34% | 93.25% | 93.58% (+0.33%) |
| 100 | 93.67% | 93.59% | 93.65% | 93.69% ✓ |

ΨLogic leads Adam at nearly every epoch from 1 to 80 in both runs; the only exception is epoch 1 of the Colab run. The final gap is ≤0.08%, statistically negligible, while the early-phase advantage is consistent and large (+3–7% at epochs 1–10).


📝 Text — AG News · Transformer (2-layer, d=128) · 10 epochs

| Epoch | Adam | AdamW | SGD | ΨLogic |
|------:|------|-------|-----|--------|
| 1 | 92.16% ★ | 92.28% ★ | 89.71% | 92.11% |
| 3 | 91.76% | 91.84% | 90.96% | 92.14% ★ |
| 5 | 90.84% | 91.16% | 91.12% | 91.37% ← leads |
| 7 | 91.17% | 91.11% | 91.33% ★ | 91.26% |
| 10 | 91.07% | 91.30% | 91.24% | 91.46% ← leads |

ΨLogic leads all optimizers at epoch 5 and epoch 10 on real NLP data.


🔊 Audio — Google SpeechCommands · CNN + Bidirectional GRU · 15 epochs · 35 classes

| Epoch | Adam | AdamW | SGD | ΨLogic |
|------:|------|-------|-----|--------|
| 1 | 80.79% | 82.87% | 41.49% | 81.27% |
| 3 | 89.93% | 91.64% | 59.89% | 90.43% |
| 5 | 92.34% | 92.91% | 77.51% | 92.57% |
| 8 | 92.98% | 93.89% | 83.54% | 93.74% ← 2nd |
| 10 | 94.06% | 94.57% | 88.78% | 94.76% ← leads |
| 12 | 94.98% | 95.10% | 89.83% | 95.11% ← leads |
| 15 | 95.50% ★ | 95.35% | 90.81% | 95.26% |

ΨLogic beats AdamW at epochs 10 and 12. Final gap to Adam at epoch 15: 0.24%.


Install

```shell
pip install psilogic
```

Drop-in Replacement for Adam

```python
# Before
from torch.optim import Adam
optimizer = Adam(model.parameters(), lr=1e-3)

# After — one line
from psilogic import PsiLogic
optimizer = PsiLogic(model.parameters(), lr=1e-3)
```

Training loop is 100% identical. No other code changes needed.


The Formula

```
Ψ_{t+1} = Ψ_t  −  η · m̂_t / (√v̂_t + ε)  −  η · γ · P · tanh(S_t) · Ψ_t
                   ─────────────────────────    ─────────────────────────────
                        Adaptive gradient             Active Cancellation
```

Where S_t is an EMA of gradient norms — the chaos detector:

```
S_t = (1 − α) · S_{t-1}  +  α · ‖∇_t‖₂
```

| Phase | S_t | tanh(S_t) | Behaviour |
|-------|-----|-----------|-----------|
| Early training — high loss | large | → 1.0 | Strong cancellation, damps erroneous weights |
| Late training — converging | small | → 0.1 | Minimal interference |
| Fully converged | ≈ 0 | → 0.0 | Term vanishes completely |
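For intuition, the update rule above can be sketched as a scalar re-implementation in plain Python. This is an illustrative sketch only, not the package's internals: `psilogic_step` and its `state` dict are hypothetical names, and the real optimizer operates on tensors with per-parameter state.

```python
import math

def psilogic_step(theta, grad, state, lr=1e-3, gamma=0.05, p_ext=1.2,
                  beta1=0.9, beta2=0.999, eps=1e-8, ema_alpha=0.05):
    """One ΨLogic-style update for a single scalar parameter."""
    t = state["t"] = state["t"] + 1
    # Adam-style first and second moments, with bias correction
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad
    m_hat = state["m"] / (1 - beta1 ** t)
    v_hat = state["v"] / (1 - beta2 ** t)
    # Chaos detector: EMA of the gradient magnitude
    state["S"] = (1 - ema_alpha) * state["S"] + ema_alpha * abs(grad)
    # Adaptive gradient step + active cancellation term
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)
    theta -= lr * gamma * p_ext * math.tanh(state["S"]) * theta
    return theta

# Minimize f(θ) = θ² from θ = 1; the gradient is 2θ
state = {"t": 0, "m": 0.0, "v": 0.0, "S": 0.0}
theta = 1.0
for _ in range(100):
    theta = psilogic_step(theta, grad=2 * theta, state=state)
```

Note how the cancellation term scales with `tanh(state["S"])`: large early gradients push it toward full strength, and as gradients shrink near convergence the term fades and the update reduces to a plain adaptive step.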

API

```python
from psilogic import PsiLogic

optimizer = PsiLogic(
    params,
    lr           = 1e-3,   # learning rate η
    gamma        = 0.05,   # cancellation strength γ
    p_ext        = 1.2,    # external power scale P
    momentum     = 0.9,    # first moment decay β₁
    beta2        = 0.999,  # second moment decay β₂
    eps          = 1e-8,   # numerical stability ε
    ema_alpha    = 0.05,   # chaos detector EMA decay α
    weight_decay = 0.0,    # L2 regularization
)
```

Recommended Settings

| Task | lr | gamma | ema_alpha |
|------|----|-------|-----------|
| Image classification | 1e-3 | 0.05 | 0.05 |
| NLP / Transformer | 5e-4 | 0.03 | 0.03 |
| Audio classification | 1e-3 | 0.05 | 0.05 |
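If you select hyperparameters programmatically, the table above can be captured in a small lookup. `RECOMMENDED` and `psilogic_kwargs` are hypothetical helper names introduced here for illustration, not part of the package:

```python
# Recommended ΨLogic settings per task family (from the table above)
RECOMMENDED = {
    "image": {"lr": 1e-3, "gamma": 0.05, "ema_alpha": 0.05},
    "nlp":   {"lr": 5e-4, "gamma": 0.03, "ema_alpha": 0.03},
    "audio": {"lr": 1e-3, "gamma": 0.05, "ema_alpha": 0.05},
}

def psilogic_kwargs(task: str) -> dict:
    """Return a fresh dict of keyword arguments for PsiLogic(...)."""
    return dict(RECOMMENDED[task])

# Usage: optimizer = PsiLogic(model.parameters(), **psilogic_kwargs("nlp"))
```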

Reproduce

```shell
git clone https://github.com/Troxter222/psilogic
pip install torch torchvision torchaudio datasets

python examples/resnet_benchmark.py       # ResNet-18 · 100 epochs
python examples/multimodal_benchmark.py   # Images + Text + Audio
```

License

MIT © 2025


"Fire hard when wrong. Disappear when right."
