
Agentic memory with mental-model architecture. Deterministic, replay-safe substrate for cognitive-architecture agents: LinOSS temporal encoding, Hopfield attractor memory, mental-model and metacognitive primitives, strategy evolution, pluggable MemEvolve benchmark adapter. Contribution: byte-identical replay, immutable lineage, provider abstraction, composable operators.


Elume


Agentic memory with mental-model architecture — a deterministic, replay-safe substrate for cognitive-architecture agents.

Elume is the cognitive substrate underneath an agent: long-horizon temporal encoding, attractor-based associative memory, mental-model primitives, metacognitive control, and deterministic strategy evolution — all behind clean provider boundaries, all replay-safe by construction.

Memory is the entry point. Mental modeling is the architecture. Replay-safety is the engineering contribution.

It integrates LinOSS-style temporal encoding (Rusch & Rus, ICLR 2025), Hopfield-style associative memory, mental-model and metacognitive record types, and a deterministic evolution substrate into one open-source stack. The contribution of Elume is not the invention of the underlying methods in isolation, but the engineering work required to combine them, adapt their codepaths, and make them operate coherently inside a single deterministic kernel.

What Elume is

Elume is a runtime cognitive substrate for agents that need to:

  • encode long trajectories with oscillatory state-space dynamics,
  • recover useful prior state through attractor-based associative recall,
  • maintain explicit mental models with predictions and revisions,
  • exercise metacognitive control over inference and action selection,
  • and evolve memory strategies over time, deterministically.

The full primitive set:

Layer Modules Role
Temporal encoding elume.linoss LinOSS solver, encoder, timing
Memory substrate elume.basins, elume.network Attractor field, Hopfield, self-modeling network
Mental modeling elume.models.mental_model, elume.cognition.mental_model MentalModel, BasinRelationship, PredictionTemplate, ModelPrediction, ModelRevision, mental-model subnetworks
Metacognition elume.models.metacognitive CognitiveCore, MetacognitiveParticle, MentalAction, ParticleType
Belief & cognition elume.models.belief, elume.models.cognitive, elume.cognition Belief states, cognitive events, deterministic thought competition, prior-gated cognition, curiosity homing
Evolution elume.evolution Strategy lifecycle, GA over immutable Strategy records
Determinism elume.envelope Canonical pre-image hashing, byte-identical replay
Integration elume.providers, elume.adapters, elume.embedders Provider contracts, MemEvolve cartridge, embedding protocols

In practice, Elume packages and stabilizes multiple upstream ideas plus original cognitive-architecture engineering so they can be used together as one substrate.

What Elume is not

Elume does not claim authorship of the original LinOSS, MemEvolve, or Hopfield-style memory ideas.

Instead, it is an open-source composition of these components, with the modifications, interfaces, and system-level fixes needed to make them work together in one usable framework.

Elume's evolution module is a deterministic, replay-safe genetic algorithm operating on immutable Strategy records through a provider boundary. The framing — agent memory as an evolvable population rather than policy weights — is adopted from MemEvolve (Zhang et al. 2025, arXiv:2512.18746). The implementation is original Elume work using standard GA primitives. What Elume contributes is the engineering substrate: byte-identical replay, immutable lineage, provider abstraction, and composable operator protocols.
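That substrate can be sketched as a frozen record with successor semantics plus a GA step driven entirely by an injected RNG. The record shape and names below are illustrative, not Elume's actual API:

```python
import random
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class Strategy:
    """Immutable strategy record; evolution produces successors (hypothetical shape)."""
    genome: tuple[float, ...]
    generation: int = 0

    def evolved(self, genome: tuple[float, ...]) -> "Strategy":
        # Successor semantics: never mutate in place, return a new record.
        return replace(self, genome=genome, generation=self.generation + 1)


def evolve_step(pop: list[Strategy], fitness, rng: random.Random) -> list[Strategy]:
    """One GA step. All randomness flows through the injected RNG, so the
    same seed and same population always yield identical successors."""
    ranked = sorted(pop, key=fitness, reverse=True)
    parents = ranked[: max(2, len(pop) // 2)]
    children = []
    for _ in pop:
        a, b = rng.sample(parents, 2)
        cut = rng.randrange(len(a.genome))
        child = a.genome[:cut] + b.genome[cut:]
        # Point mutation, also through the injected RNG only.
        child = tuple(g + rng.gauss(0, 0.1) for g in child)
        children.append(a.evolved(child))
    return children
```

Running this twice with the same seed reproduces the successor population exactly, which is the replay property the envelope then certifies by hash.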

What Elume created

Things that did not exist anywhere before this project:

  • The deterministic envelope (elume.envelope, v0.1) — a canonical pre-image (BLAKE2b-256) over operation inputs, RNG state, result, and provider snapshot, giving every cognitive op a byte-identical replay contract. Five reference operations registered today (belief embed, basin recall, thought competition, evolution step, self-model step).
  • The platform-tagged float-hash policy — platform_fingerprint() is folded into the canonical pre-image so cross-platform replay drift surfaces as a hash mismatch by construction, not silent agreement on incidentally-matching bytes.
  • elume.adapters.memevolve.ElumeMemoryProvider (v0.2) — the first deterministic baseline in MemEvolve's --memory_provider list. Same seed, same input → byte-identical MemoryResponse per step.
  • cognition.curiosity_score as a replayable envelope op (v0.2) — Shannon-entropy + information-gain scoring wrapped with the same hash-equal replay contract as every other op. The math is ported from dionysus3 (credited below); the envelope wrap, the integration with run_gated_thought_competition, and the CuriosityPrior derivation are original Elume work.
  • Hyperevolution coupling (v0.2) — the wiring inside ElumeMemoryProvider that lets curiosity continuously re-acquire the search heading: provide_memory re-ranks basins by current information gain; take_in_memory updates a per-session BeliefBuffer from trajectory outcomes; the whole pattern toggles via one config key.
  • The kernel discipline — frozen records, successor semantics (.evolved(), .revised(), .with_status()), injected RNG, provider-boundary persistence, no framework dependencies. This discipline applied uniformly across LinOSS, basins, evolution, cognition, embedders, and providers is what makes the whole stack composable inside one Python package.

What Elume adopted from upstream is named in the Attribution section. What Elume created is everything in the bullets above.
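The envelope idea — hash a canonical pre-image of operation inputs, RNG state, result, and provider snapshot with BLAKE2b-256 — can be sketched as follows. Field names and serialization are illustrative, not the actual elume.envelope API:

```python
import hashlib
import json


def envelope_digest(op: str, inputs: dict, rng_state: tuple,
                    result: dict, provider_snapshot: dict) -> str:
    """Canonical pre-image hashing in the spirit of elume.envelope.

    Serialization is made canonical (sorted keys, fixed separators) so the
    same operation always produces the same bytes, and therefore the same
    BLAKE2b-256 digest, on replay.
    """
    pre_image = json.dumps(
        {
            "op": op,
            "inputs": inputs,
            "rng_state": list(rng_state),
            "result": result,
            "provider_snapshot": provider_snapshot,
        },
        sort_keys=True,
        separators=(",", ":"),
    ).encode("utf-8")
    return hashlib.blake2b(pre_image, digest_size=32).hexdigest()
```

Any change to inputs, RNG state, result, or provider snapshot changes the digest, which is what makes replay divergence visible rather than silent.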

Core composition

Elume combines:

  1. LinOSS-based temporal encoding for long-horizon trajectory representation.
  2. Attractor-based associative memory for content-addressable recall.
  3. Deterministic adaptive memory logic for improving memory behavior over time, with an optional curiosity homing signal.

These components are integrated into a shared memory pipeline for agentic learning.
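The oscillatory encoding in step 1 can be illustrated with a toy semi-implicit update of a forced harmonic oscillator — a sketch of LinOSS-style dynamics, not Elume's actual solver; all names here are illustrative:

```python
import numpy as np


def oscillatory_ssm_step(y, v, u, omega_sq, B, dt):
    """One toy semi-implicit Euler step of a forced oscillator layer:
        y'' = -omega_sq * y + B @ u,
    split into position y and velocity v. The velocity is updated first and
    the position uses the new velocity, which keeps unforced oscillations
    bounded over long horizons — the property that makes oscillatory
    state-space models attractive for long-trajectory encoding.
    """
    v_next = v + dt * (-omega_sq * y + B @ u)
    y_next = y + dt * v_next
    return y_next, v_next
```

With zero forcing the state oscillates without blowing up, unlike a naive explicit-Euler discretization of the same ODE.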

Why Elume

  • Determinism — injected RNG, byte-identical replay within a platform fingerprint. Every retrieval decision can be audited.
  • Immutable records — frozen trajectory snapshots, belief states, and basin activations. Strategies evolve via successors, not mutation.
  • Provider boundary — storage is a protocol contract, not an implementation. Swap backends without touching cognition code.
  • No framework lock-in — no FastAPI, Graphiti, or agent runtime in the core. Adapters live in consumers.
  • Cross-platform float-hash policy — platform_fingerprint() is folded into the canonical hash pre-image. Cross-platform drift is a visible mismatch, not silent corruption.
  • Curiosity-driven hyperevolution — the optional curiosity homing signal biases memory retrieval toward entropy-reducing directions, turning uniform-random search into goal-directed exploration.
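The entropy-reduction idea behind the curiosity signal can be sketched generically. Function names below are illustrative; Elume's cognition.curiosity_score additionally wraps the scoring in the replay envelope:

```python
import math


def shannon_entropy(probs: list[float]) -> float:
    """Entropy in bits of a discrete belief distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)


def information_gain(prior: list[float], posterior: list[float]) -> float:
    """Entropy reduction from prior to posterior: the curiosity-style score."""
    return shannon_entropy(prior) - shannon_entropy(posterior)


def rank_by_gain(candidates: dict[str, list[float]], prior: list[float]) -> list[str]:
    """Re-rank candidates (e.g. basins) by how much each expected posterior
    reduces entropy, so retrieval is biased toward entropy-reducing directions."""
    return sorted(candidates,
                  key=lambda k: information_gain(prior, candidates[k]),
                  reverse=True)
```

A candidate whose expected posterior is sharp (low entropy) ranks ahead of one that leaves the belief nearly uniform, turning uniform-random search into goal-directed exploration.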

MemEvolve cartridge

Elume v0.2.0 ships a BaseMemoryProvider-conformant adapter so MemEvolve (bingreeky/MemEvolve) can benchmark Elume against its 11 existing baselines. Two-line registration, then:

python run_flash_searcher_mm_gaia.py --memory_provider elume --sample_num 5

See docs/adapters/memevolve.md for the full install guide, determinism guarantee, and hyperevolution mode.

Why Elume exists

Many memory systems are strong in isolation but difficult to combine in practice.

Elume exists to make these components interoperable: to unify their interfaces, reconcile assumptions, patch incompatibilities, and provide a coherent open-source implementation that others can inspect, use, and build on.

Attribution

Elume builds directly on upstream work and code associated with LinOSS, MemEvolve, Hopfield-style associative memory, and attractor / neural-field context-engineering ideas.

Specific upstream sources:

  • LinOSS — Oscillatory State-Space Models — T. Konstantin Rusch and Daniela Rus, International Conference on Learning Representations (ICLR), 2025. Temporal encoding substrate and oscillator dynamics inside the basin field.
  • MemEvolve — Meta-Evolution of Agent Memory Systems — Guibin Zhang, Haotian Ren, Chong Zhan, Zhenhong Zhou, Junhao Wang, He Zhu, Wangchunshu Zhou, and Shuicheng Yan, arXiv preprint 2512.18746, 2025. Source of the evolvable-memory-population framing. The BaseMemoryProvider cartridge interface and shaping helpers in src/elume/adapters/memevolve/shaping.py are adapted from the bingreeky/MemEvolve codebase (Apache-2.0), with HTTP/HMAC stripped.
  • Context Engineering: Beyond Prompt Engineering — Context Engineering Contributors (maintained by David Kimai), github.com/davidkimai/context-engineering (MIT), 2025. Source of the attractor-based neural-field model at the core of Elume's memory layer — specifically 00_foundations/08_neural_fields_foundations.md, 00_foundations/11_emergence_and_attractor_dynamics.md, 40_reference/attractor_dynamics.md, and the memory-attractor protocol shells in 60_protocols/shells/.
  • Hopfield-style associative memory — Hopfield (PNAS 1982); textbook synthesis from Anderson (2014, Ch. 13); capacity bound from Amit, Gutfreund & Sompolinsky (1985). Classical mathematical substrate for discrete pattern storage inside the basin subsystem.
  • Source codebase — dionysus3, a research cognitive architecture. Every module in elume/ was originally developed there. Elume relocates the kernel math with verbatim semantics and strips project-specific glue so the result is a pure library. The Shannon-entropy + information-gain mechanism in src/elume/cognition/curiosity.py is ported from dionysus3's CuriosityDriveService (api/services/mosaeic_self_discovery.py and arousal_system_service.py).
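The classical Hopfield substrate credited above can be sketched in a few lines — Hebbian outer-product storage plus sign-update recall. This is a textbook illustration of the 1982 model, not Elume's basin implementation:

```python
import numpy as np


def hopfield_store(patterns: np.ndarray) -> np.ndarray:
    """Hebbian outer-product storage for +/-1 patterns (Hopfield 1982).
    Each row of `patterns` is one stored pattern; the diagonal is zeroed."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W


def hopfield_recall(W: np.ndarray, probe: np.ndarray, max_steps: int = 20) -> np.ndarray:
    """Synchronous sign-update dynamics, iterated until a fixed point
    (an attractor) is reached — content-addressable recall from a noisy probe."""
    state = probe.copy()
    for _ in range(max_steps):
        nxt = np.where(W @ state >= 0.0, 1.0, -1.0)
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state
```

A probe with a corrupted bit falls into the basin of the stored pattern and the dynamics recover the clean pattern, which is the content-addressable recall property the basin subsystem builds on.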

BibTeX entries for all upstream academic citations are in CITATIONS.bib. Please cite the upstream sources in any published work that uses Elume.

Status

Elume is an open-source integration project under active development.

Twenty-five tracks landed: kernel bootstrap, core data models, LinOSS solver + timing, Hopfield network, basin field engine, attractor basin core, embedder protocol, provider contracts, the evolution engine, the self-modeling network engine, immutable cognitive record types, immutable mental-model domain records, immutable metacognitive control records, prior hierarchy records, mental-model subnetworks, the cognitive event protocol, cognitive-event embedders, immutable thought-level records, immutable neuronal-packet records, deterministic thought competition, prior-gated cognition, the MemEvolve cartridge, curiosity homing device, and hyperevolution wiring. Track 007 was retired after source review showed it was framed against the wrong dionysus3 concept. 1177 tests passing, ruff clean.

Phase 2 is complete through the prior gate: Track 011 shipped elume.network, Tracks 014, 016, 018, 021, and 022 landed the minimal cognition gate from MentalModel through LinOSSEncoder, Tracks 012, 013, and 019 landed immutable thought and packet records plus deterministic EFE competition, and Tracks 015, 017, and 020 landed metacognitive control, generic priors, and prior-gated cognition. See conductor/tracks.md.

Phase 3 is complete: the MemEvolve cartridge (elume.adapters.memevolve), curiosity homing (elume.cognition.curiosity), and hyperevolution wiring now connect Elume's deterministic substrate to MemEvolve's outer evolutionary loop.

Archon-style deterministic-harness adoption is complete for v0.1.0. The kernel has injected RNGs, frozen trajectory metadata, provider snapshots, and an elume.envelope v0 operation registry covering belief embedding, evolution step, thought competition, self-model stepping, Hopfield recall, and (v0.2.0) curiosity scoring. Cross-platform float-hash policy is documented in docs/archon-readiness/21-float-hash-policy.md.

Install

Requires Python >=3.11.

pip install elume

Quickstart (development)

For local development, use uv and an editable install:

# from the repo root
uv venv .venv
uv pip install -e ".[dev]"

# run the test suite
.venv/bin/pytest

# lint
.venv/bin/ruff check src tests reference_service/src

# optional: reference service demo
uv pip install -e ./reference_service
PYTHONPATH=src:reference_service/src python -m reference_service

Layout

elume/
├── src/
│   └── elume/
│       ├── basins/      # Hopfield + basin field dynamics (neural fields model)
│       ├── cognition/   # mental-model subnetworks + typed cognitive events
│       ├── embedders/   # event -> trajectory projection protocols
│       ├── linoss/      # oscillatory state-space primitives (solver, timing, encoder)
│       ├── network/     # self-modeling network substrate for Phase 2 cognition
│       ├── evolution/   # successor-based strategy evolution
│       ├── providers/   # storage contracts + reference provider
│       ├── envelope/    # deterministic replay envelope + reference ops
│       ├── adapters/    # provider adapters (memevolve cartridge)
│       └── models/      # beliefs, strategies, trajectories, cognitive + thought records
├── reference_service/   # runnable CLI/FastAPI demo (separate package, optional)
├── tests/
│   ├── unit/            # unit tests for kernel modules
│   ├── contract/        # contract tests consumers re-run against their impls
│   └── integration/     # end-to-end composition tests across subsystems
└── conductor/           # spec-driven development docs and tracks

Consuming Elume

Downstream projects pin a versioned PyPI release:

pip install elume==0.1.0

For co-development against an unreleased branch, an editable install also works:

# from the consumer repo (e.g. dionysus3)
pip install -e /path/to/elume

Principles

  • Integration, not invention. The underlying techniques are open source or openly published; Elume's work is bringing them together.
  • Kernel, not application. Reusable mechanism only. Adapters and policies live in consumers.
  • No framework lock-in. No FastAPI, no Graphiti, no agent runtime in the core.
  • Pluggable storage. Providers are contracts, not implementations.
  • Reproducible. Deterministic where possible; evolution randomness goes through an injectable RNG.
  • Contract tests as the regression net. Consumers re-run tests/contract/ against their provider implementations.
  • The past is frozen. Trajectory records, belief snapshots, and basin activations are immutable. Strategies evolve by producing successors, not by mutating in place.

On the name

Elume is the brand form. ELUME works as an acronym mnemonic — Evolvable Long-horizon Unified Mental-model Engine.

For public descriptors:

  • Short: Cognitive Substrate for Agents
  • Memory-first framing (for SaaS/category fit): Agentic Memory with Mental-Model Architecture
  • Technical long form: Deterministic, Replay-Safe Substrate for Cognitive-Architecture Agents
  • Tagline: Agentic memory with mental-model architecture — a deterministic, replay-safe substrate for cognitive-architecture agents.

License

MIT. Compatible with Context-Engineering's MIT license and all upstream components.

See ATTRIBUTION.md and conductor/product.md for the full attribution and product specifications.
