
A backend-neutral kernel of predictive primitives for descendant systems.


Decepticons



Website · Architecture · Kernel matrix · Examples · Related work

O(n) attention is deception. A backend-neutral kernel of predictive primitives — substrates, memory, gating, routing, readouts — that downstream systems combine into trained models without forking the kernel itself.

decepticons is the shared mechanism layer for predictive descendants: substrate dynamics, controller summaries, memory primitives, feature views, readouts, and runtime helpers extracted from a broader experiment family so downstream systems can specialize without forking the kernel.

Install

Requires Python ≥ 3.11. The kernel itself only needs numpy.

If you don't already have a Python virtual environment, make one first. Modern Linux distributions block pip from writing into the system Python (PEP 668), so a venv is the standard path:

python3 -m venv .venv
source .venv/bin/activate      # Windows: .venv\Scripts\activate

Then install from PyPI:

pip install decepticons

For the model backends:

pip install "decepticons[torch]"   # PyTorch CausalBankModel + routed readouts
pip install "decepticons[metal]"   # Apple MLX backend (Apple Silicon)

To leave the venv when you're done: deactivate.

For development from source (clone + editable install + run tests):

git clone https://github.com/asuramaya/decepticons
cd decepticons
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[test]"
pytest -v

Quickstart

from decepticons import ByteCodec, ByteLatentPredictiveCoder

text = "predictive coding likes repeated structure.\n" * 64
model = ByteLatentPredictiveCoder()
report = model.fit(text)

prompt = ByteCodec.encode_text("predictive ")
sample = model.generate(prompt, steps=40, greedy=True)

print(report.train_bits_per_byte)
print(ByteCodec.decode_text(sample))

CLI:

decepticons fit --input ./corpus.txt --prompt "predictive " --generate 80

A complete worked example lives in examples/quickstart.py. For descendant-shaped projects, see examples/projects/.
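The `train_bits_per_byte` figure in the fit report is a standard compression metric: the average number of bits the model spends per observed byte. A minimal numpy sketch of the computation (not the library's internal code, which is not shown here):

```python
import numpy as np

def bits_per_byte(probs: np.ndarray) -> float:
    """Average negative log2-probability the model assigned to each
    observed next byte; lower is better, 8.0 matches a uniform model."""
    return float(np.mean(-np.log2(probs)))

# A model that assigns probability 0.5 to every observed byte
# spends exactly 1 bit per byte.
print(bits_per_byte(np.full(64, 0.5)))  # 1.0
```

Anything below 8.0 means the model has found exploitable structure, which is why the repeated-text corpus in the quickstart is a friendly first target.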

What's in the kernel

| Area | Highlights |
| --- | --- |
| Substrates | recurrent, delay, linear-memory, oscillatory, mixed, hierarchical |
| Control | controller summaries, pathway gates, summary routing, hormone modulation, predictive surprise |
| Memory | exact-context, n-gram, statistical-backoff, online n-gram, cache views |
| Views | byte-latent, hierarchical, linear-memory, sampled multiscale, bridge features, probability diagnostics |
| Readouts | ridge, frozen-readout expert, sampled multiscale, GRU recurrent, routed squared-ReLU |
| Adapters | causal predictive, oracle analysis, bridge export, noncausal reconstructive, paired teacher/export |
| Causal-bank | family metadata + deterministic substrate construction (frozen / learnable-decays / learnable-mixing / learned-recurrence / gated-retention) |
| Runtime | traces, fit reports, rollout evaluation, transfer probes, train-mode checkpoints, artifact accounting |
| Backends | numpy-only kernel; PyTorch and MLX CausalBankModel implementations |

Full capability matrix: docs/kernel_matrix.md.
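The simplest readout in the table is ridge regression over substrate states. As a standalone sketch of that mechanism (hypothetical code, not the kernel's actual `Readout` API):

```python
import numpy as np

def fit_ridge_readout(H: np.ndarray, Y: np.ndarray, lam: float = 1e-2) -> np.ndarray:
    """Closed-form ridge: W = (H^T H + lam*I)^-1 H^T Y.
    H: (T, d) substrate states over time, Y: (T, k) prediction targets."""
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(d), H.T @ Y)

# Recover a known linear readout from noiseless states.
rng = np.random.default_rng(0)
H = rng.standard_normal((200, 16))
W_true = rng.standard_normal((16, 4))
W = fit_ridge_readout(H, H @ W_true, lam=1e-6)
print(np.allclose(W, W_true, atol=1e-3))  # True
```

Because the readout is closed-form, the substrate itself can stay frozen, which is what makes cheap ablation lanes across substrate families practical.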

Architecture

decepticons  ──→  chronohorn  ──→  heinrich
  kernel          runtime          forensics
 (this repo)   training · fleet   geometry · audit

Three layers inside this repo:

  1. Kernel: src/decepticons/. Public package. Reusable mechanisms only.
  2. Project descendants: examples/projects/. Pressure-test the kernel boundary with concrete descendant shapes (causal · oracle · bridge · noncausal · byte-latent).
  3. Tooling: examples/tools/. Development and analysis scripts. Not part of the public package.

Code moves into src/ only when all three hold:

  1. It is a mechanism, not a project policy.
  2. At least two descendants want the same thing.
  3. The generalized API is simpler than keeping the duplication.

This rule is the main defense against turning the kernel into a renamed collection of branches. Full detail in docs/architecture.md and the boundary against the runtime in docs/chronohorn_boundary.md.

Causality is verified

All substrate modes are verified by tests/test_causality.py: it feeds the model two sequences that are identical up to position t and differ after t. If the logits at position t differ between the two runs, causality is violated and CI fails. Modes verified: frozen, learnable_mixing, learnable_decays, selective-scan augment (state_dim > 0), readout_bands, routed experts.
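The same check can be expressed in a few lines against a toy model (this is a self-contained sketch, not the repo's test code):

```python
import numpy as np

def causal_model(x: np.ndarray) -> np.ndarray:
    """Toy causal 'substrate': output at t depends only on x[:t+1]
    (an exponential moving average), so it must pass the check."""
    out, state = np.empty(len(x)), 0.0
    for t, v in enumerate(x):
        state = 0.9 * state + 0.1 * v
        out[t] = state
    return out

rng = np.random.default_rng(0)
t = 10
a = rng.standard_normal(20)
b = a.copy()
b[t + 1:] = rng.standard_normal(20 - t - 1)  # sequences differ only after t

# Outputs up to and including position t must be identical;
# any divergence there would mean information leaked from the future.
assert np.allclose(causal_model(a)[: t + 1], causal_model(b)[: t + 1])
```

A model with any lookahead (say, a centered moving average) fails this assertion at position t, which is exactly the failure mode the CI gate exists to catch.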

decepticons never imports its descendants — enforced by an AST scan in tests/test_dependency_firewall.py.
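An AST-based import firewall of this kind is straightforward to sketch with the standard library (hypothetical standalone code; the forbidden names below are taken from the architecture diagram, and the repo's actual scan may differ):

```python
import ast

# Descendant packages the kernel must never import.
FORBIDDEN = {"chronohorn", "heinrich"}

def forbidden_imports(source: str) -> set[str]:
    """Return the top-level forbidden packages imported by `source`."""
    hits = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module:
            names = [node.module]
        else:
            continue
        hits.update(n.split(".")[0] for n in names if n.split(".")[0] in FORBIDDEN)
    return hits

print(forbidden_imports("import numpy\nfrom chronohorn.fleet import x"))  # {'chronohorn'}
```

Scanning the AST rather than grepping text means string literals and comments that merely mention a descendant's name can never trip the firewall.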

Docs

Scope

This is a research kernel and reference implementation. The current pressure from descendants is O(n) causal-bank architecture search — cheap ablation lanes to separate mechanisms before promotion, with scale and context survival checked in the descendant runtime.

It is not a frontier runtime, a production compression stack, or a benchmark claim. It exists to keep the shared mechanism layer reusable and legible.

Contributing

See CONTRIBUTING.md. Issues and pull requests welcome.

License

MIT — see LICENSE.
