Elume — an open-source agentic memory engine for long-horizon adaptive learning. An integration layer bringing together LinOSS oscillatory state-space models, attractor-based associative memory, the MemEvolve cartridge (`ElumeMemoryProvider`), and curiosity homing (Shannon-entropy information-gain steering) into a single unified memory kernel. The contribution is integration, not invention.
Elume
An open-source agentic memory engine for long-horizon adaptive learning.
Elume brings together existing memory and sequence-modeling components into a single working system for long-horizon agents.
It integrates LinOSS-style long-horizon temporal encoding (Rusch & Rus, ICLR 2025), attractor-based associative memory, and a deterministic adaptive memory substrate into one open-source stack. The contribution of Elume is not the invention of these underlying methods in isolation, but the engineering work required to combine them, adapt their codepaths, and make them operate coherently in a unified memory system.
What Elume is
Elume is an integration layer and runtime memory stack for agents that need to:
- encode long trajectories,
- recover useful prior state through associative recall,
- and adapt memory behavior over time.
In practice, Elume packages and stabilizes multiple upstream ideas and implementations so they can be used together as a single agent memory engine.
What Elume is not
Elume does not claim authorship of the original LinOSS, MemEvolve, or Hopfield-style memory ideas.
Instead, it is an open-source composition of these components, with the modifications, interfaces, and system-level fixes needed to make them work together in one usable framework.
Elume's evolution module is a deterministic, replay-safe genetic algorithm operating on immutable Strategy records through a provider boundary. The framing — agent memory as an evolvable population rather than policy weights — is adopted from MemEvolve (Zhang et al. 2025, arXiv:2512.18746). The implementation is original Elume work using standard GA primitives. What Elume contributes is the engineering substrate: byte-identical replay, immutable lineage, provider abstraction, and composable operator protocols.
What Elume created
Things that did not exist anywhere before this project:
- The deterministic envelope (`elume.envelope`, v0.1) — a canonical pre-image (BLAKE2b-256) over operation inputs, RNG state, result, and provider snapshot, giving every cognitive op a byte-identical replay contract. Five reference operations are registered today: belief embed, basin recall, thought competition, evolution step, and self-model step.
- The platform-tagged float-hash policy — `platform_fingerprint()` is folded into the canonical pre-image, so cross-platform replay drift surfaces as a hash mismatch by construction, not as silent agreement on incidentally matching bytes.
- `elume.adapters.memevolve.ElumeMemoryProvider` (v0.2) — the first deterministic baseline in MemEvolve's `--memory_provider` list. Same seed, same input → byte-identical `MemoryResponse` per step.
- `cognition.curiosity_score` as a replayable envelope op (v0.2) — Shannon-entropy + information-gain scoring wrapped in the same hash-equal replay contract as every other op. The math is ported from dionysus3 (credited below); the envelope wrap, the integration with `run_gated_thought_competition`, and the `CuriosityPrior` derivation are original Elume work.
- Hyperevolution coupling (v0.2) — the wiring inside `ElumeMemoryProvider` that lets curiosity continuously re-acquire the search heading: `provide_memory` re-ranks basins by current information gain; `take_in_memory` updates a per-session `BeliefBuffer` from trajectory outcomes; the whole pattern toggles via one config key.
- The kernel discipline — frozen records, successor semantics (`.evolved()`, `.revised()`, `.with_status()`), injected RNG, provider-boundary persistence, and no framework dependencies. Applied uniformly across LinOSS, basins, evolution, cognition, embedders, and providers, this discipline is what makes the whole stack composable inside one Python package.
What Elume adopted from upstream is named in the Attribution section. What Elume created is everything in the bullets above.
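To make the replay contract concrete, here is a minimal envelope hash in the same spirit, using only the standard library. The field set, canonicalization, and `platform_fingerprint()` details below are simplified stand-ins written for this sketch, not the real `elume.envelope` policy.

```python
import hashlib
import json
import platform
import random

def platform_fingerprint() -> str:
    """A coarse platform tag; the real policy may fold in more detail
    (float mode, library builds, etc.). Illustrative only."""
    return f"{platform.system()}-{platform.machine()}-{platform.python_version()}"

def envelope_hash(op_name, inputs, rng_state, result) -> str:
    """Canonical pre-image -> BLAKE2b-256 digest.

    Any drift in inputs, RNG state, result, or platform changes the
    pre-image and therefore the digest, so divergence is a visible
    mismatch rather than silent corruption."""
    pre_image = json.dumps(
        {
            "op": op_name,
            "inputs": inputs,
            "rng_state": rng_state,
            "result": result,
            "platform": platform_fingerprint(),
        },
        sort_keys=True,
        separators=(",", ":"),
    ).encode("utf-8")
    return hashlib.blake2b(pre_image, digest_size=32).hexdigest()

rng = random.Random(0)
state = str(rng.getstate())
h1 = envelope_hash("basin_recall", {"query": [0.1, 0.2]}, state, {"basin": 3})
h2 = envelope_hash("basin_recall", {"query": [0.1, 0.2]}, state, {"basin": 3})
assert h1 == h2  # byte-identical replay on the same platform
assert h1 != envelope_hash("basin_recall", {"query": [0.1, 0.3]}, state, {"basin": 3})
```

Folding the platform tag into the pre-image is what turns cross-platform float drift into a hash mismatch by construction: two platforms can only agree when their tagged pre-images agree byte for byte.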
Core composition
Elume combines:
- LinOSS-based temporal encoding for long-horizon trajectory representation.
- Attractor-based associative memory for content-addressable recall.
- Deterministic adaptive memory logic for improving memory behavior over time, with an optional curiosity homing signal.
These components are integrated into a shared memory pipeline for agentic learning.
Why Elume
- Determinism — injected RNG, byte-identical replay within a platform fingerprint. Every retrieval decision can be audited.
- Immutable records — frozen trajectory snapshots, belief states, and basin activations. Strategies evolve via successors, not mutation.
- Provider boundary — storage is a protocol contract, not an implementation. Swap backends without touching cognition code.
- No framework lock-in — no FastAPI, Graphiti, or agent runtime in the core. Adapters live in consumers.
- Cross-platform float-hash policy — `platform_fingerprint()` is folded into the canonical hash pre-image. Cross-platform drift is a visible mismatch, not silent corruption.
- Curiosity-driven hyperevolution — the optional curiosity homing signal biases memory retrieval toward entropy-reducing directions, turning uniform-random search into goal-directed exploration.
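The curiosity signal in the last bullet can be illustrated with plain Shannon entropy over a belief distribution: retrieval candidates are ranked by how much observing them would sharpen the current belief. The function names and distributions below are invented for this sketch and are not Elume's `cognition.curiosity_score` API.

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum(p_i * log2(p_i)) over the non-zero entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_gain(prior, posterior):
    """Expected entropy reduction from an observation: H(prior) - H(posterior).
    Positive gain means the observation sharpened the belief distribution."""
    return shannon_entropy(prior) - shannon_entropy(posterior)

def rank_basins_by_gain(prior, posteriors):
    """Re-rank candidate basins so the most entropy-reducing one comes first,
    i.e. uniform-random search becomes goal-directed exploration."""
    gains = {basin: information_gain(prior, post) for basin, post in posteriors.items()}
    return sorted(gains, key=gains.get, reverse=True)

prior = [0.25, 0.25, 0.25, 0.25]          # maximally uncertain: H = 2 bits
posteriors = {
    "basin_a": [0.25, 0.25, 0.25, 0.25],  # no sharpening
    "basin_b": [0.7, 0.1, 0.1, 0.1],      # strong sharpening
    "basin_c": [0.4, 0.3, 0.2, 0.1],      # mild sharpening
}
ranking = rank_basins_by_gain(prior, posteriors)
assert ranking[0] == "basin_b"  # the most entropy-reducing basin wins
```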
MemEvolve cartridge
Elume v0.2.0 ships a `BaseMemoryProvider`-conformant adapter so MemEvolve (bingreeky/MemEvolve) can benchmark Elume against its 11 existing baselines. Two-line registration, then:

```
python run_flash_searcher_mm_gaia.py --memory_provider elume --sample_num 5
```
See docs/adapters/memevolve.md for the full install guide, determinism guarantee, and hyperevolution mode.
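The determinism guarantee (same seed, same input, byte-identical response per step) can be checked with a toy provider. `ToyDeterministicProvider` below is a stand-in written for this sketch; it is neither the shipped `ElumeMemoryProvider` nor MemEvolve's actual `BaseMemoryProvider` interface.

```python
import hashlib
import json
import random

class ToyDeterministicProvider:
    """Stand-in for a deterministic memory provider: every source of
    randomness flows through a seeded RNG, so the same seed and input
    always produce byte-identical output. (Method names here are
    illustrative, not MemEvolve's real interface.)"""
    def __init__(self, seed: int):
        self._rng = random.Random(seed)

    def provide_memory(self, query: str) -> bytes:
        score = self._rng.random()
        payload = {"query": query, "retrieved": ["m1", "m2"], "score": score}
        # sort_keys gives a canonical byte serialization for hashing.
        return json.dumps(payload, sort_keys=True).encode("utf-8")

def response_digest(seed: int, query: str) -> str:
    """Hash a full response so byte-identity is a single string comparison."""
    return hashlib.sha256(ToyDeterministicProvider(seed).provide_memory(query)).hexdigest()

assert response_digest(0, "how to cache") == response_digest(0, "how to cache")
assert response_digest(0, "how to cache") != response_digest(1, "how to cache")
```

A benchmark harness can record one digest per step and compare runs, which is the shape of audit the determinism guarantee is meant to enable.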
Why Elume exists
Many memory systems are strong in isolation but difficult to combine in practice.
Elume exists to make these components interoperable: to unify their interfaces, reconcile assumptions, patch incompatibilities, and provide a coherent open-source implementation that others can inspect, use, and build on.
Attribution
Elume builds directly on upstream work and code associated with LinOSS, MemEvolve, Hopfield-style associative memory, and attractor / neural-field context-engineering ideas.
Specific upstream sources:
- LinOSS — Oscillatory State-Space Models — T. Konstantin Rusch and Daniela Rus, International Conference on Learning Representations (ICLR), 2025. Temporal encoding substrate and oscillator dynamics inside the basin field.
- MemEvolve — Meta-Evolution of Agent Memory Systems — Guibin Zhang, Haotian Ren, Chong Zhan, Zhenhong Zhou, Junhao Wang, He Zhu, Wangchunshu Zhou, and Shuicheng Yan, arXiv preprint 2512.18746, 2025. Source of the evolvable-memory-population framing. The `BaseMemoryProvider` cartridge interface and shaping helpers in `src/elume/adapters/memevolve/shaping.py` are adapted from the bingreeky/MemEvolve codebase (Apache-2.0), with HTTP/HMAC stripped.
- Context Engineering: Beyond Prompt Engineering — Context Engineering Contributors (maintained by David Kimai), github.com/davidkimai/context-engineering (MIT), 2025. Source of the attractor-based neural-field model at the core of Elume's memory layer — specifically `00_foundations/08_neural_fields_foundations.md`, `00_foundations/11_emergence_and_attractor_dynamics.md`, `40_reference/attractor_dynamics.md`, and the memory-attractor protocol shells in `60_protocols/shells/`.
- Hopfield-style associative memory — Hopfield (PNAS 1982); textbook synthesis from Anderson (2014, Ch. 13); capacity bound from Amit, Gutfreund & Sompolinsky (1985). Classical mathematical substrate for discrete pattern storage inside the basin subsystem.
- Source codebase — dionysus3, a research cognitive architecture. Every module in `elume/` was originally developed there. Elume relocates the kernel math with verbatim semantics and strips project-specific glue so the result is a pure library. The Shannon-entropy + information-gain mechanism in `src/elume/cognition/curiosity.py` is ported from dionysus3's `CuriosityDriveService` (`api/services/mosaeic_self_discovery.py` and `arousal_system_service.py`).
BibTeX entries for all upstream academic citations are in CITATIONS.bib. Please cite the upstream sources in any published work that uses Elume.
Status
Elume is an open-source integration project under active development.
Twenty-five tracks landed: kernel bootstrap, core data models, LinOSS solver + timing, Hopfield network, basin field engine, attractor basin core, embedder protocol, provider contracts, the evolution engine, the self-modeling network engine, immutable cognitive record types, immutable mental-model domain records, immutable metacognitive control records, prior hierarchy records, mental-model subnetworks, the cognitive event protocol, cognitive-event embedders, immutable thought-level records, immutable neuronal-packet records, deterministic thought competition, prior-gated cognition, the MemEvolve cartridge, the curiosity homing device, and hyperevolution wiring. Track 007 was retired after source review showed it was framed against the wrong dionysus3 concept. 1177 tests passing, ruff clean.
Phase 2 is complete through the prior gate: Track 011 shipped elume.network, Tracks 014, 016, 018, 021, and 022 landed the minimal cognition gate from MentalModel through LinOSSEncoder, Tracks 012, 013, and 019 landed immutable thought and packet records plus deterministic EFE competition, and Tracks 015, 017, and 020 landed metacognitive control, generic priors, and prior-gated cognition. See conductor/tracks.md.
Phase 3 is complete: the MemEvolve cartridge (elume.adapters.memevolve), curiosity homing (elume.cognition.curiosity), and hyperevolution wiring now connect Elume's deterministic substrate to MemEvolve's outer evolutionary loop.
Archon-style deterministic-harness adoption is complete for v0.1.0. The kernel has injected RNGs, frozen trajectory metadata, provider snapshots, and an elume.envelope v0 operation registry covering belief embedding, evolution step, thought competition, self-model stepping, Hopfield recall, and (v0.2.0) curiosity scoring. Cross-platform float-hash policy is documented in docs/archon-readiness/21-float-hash-policy.md.
Install
Requires Python >=3.11.
```
pip install elume
```
Quickstart (development)
For local development, use uv and an editable install:
```
# from the repo root
uv venv .venv
uv pip install -e ".[dev]"

# run the test suite
.venv/bin/pytest

# lint
.venv/bin/ruff check src tests reference_service/src

# optional: reference service demo
uv pip install -e ./reference_service
PYTHONPATH=src:reference_service/src python -m reference_service
```
Layout
```
elume/
├── src/
│   └── elume/
│       ├── basins/       # Hopfield + basin field dynamics (neural-fields model)
│       ├── cognition/    # mental-model subnetworks + typed cognitive events
│       ├── embedders/    # event -> trajectory projection protocols
│       ├── linoss/       # oscillatory state-space primitives (solver, timing, encoder)
│       ├── network/      # self-modeling network substrate for Phase 2 cognition
│       ├── evolution/    # successor-based strategy evolution
│       ├── providers/    # storage contracts + reference provider
│       ├── envelope/     # deterministic replay envelope + reference ops
│       ├── adapters/     # provider adapters (memevolve cartridge)
│       └── models/       # beliefs, strategies, trajectories, cognitive + thought records
├── reference_service/    # runnable CLI/FastAPI demo (separate package, optional)
├── tests/
│   ├── unit/             # unit tests for kernel modules
│   ├── contract/         # contract tests consumers re-run against their impls
│   └── integration/      # end-to-end composition tests across subsystems
└── conductor/            # spec-driven development docs and tracks
```
Consuming Elume
Downstream projects pin a versioned PyPI release:

```
pip install elume==0.1.0
```

For co-development against an unreleased branch, an editable install also works:

```
# from the consumer repo (e.g. dionysus3)
pip install -e /path/to/elume
```
Principles
- Integration, not invention. The underlying techniques are open source or openly published; Elume's work is bringing them together.
- Kernel, not application. Reusable mechanism only. Adapters and policies live in consumers.
- No framework lock-in. No FastAPI, no Graphiti, no agent runtime in the core.
- Pluggable storage. Providers are contracts, not implementations.
- Reproducible. Deterministic where possible; evolution randomness goes through an injectable RNG.
- Contract tests as the regression net. Consumers re-run `tests/contract/` against their provider implementations.
- The past is frozen. Trajectory records, belief snapshots, and basin activations are immutable. Strategies evolve by producing successors, not by mutating in place.
On the name
Elume is the brand form. ELUME works as an acronym mnemonic — Evolving, Long-horizon, Unified, Memory, Engine.
For public descriptors:
- Short: Agentic Memory Engine
- Technical long form: Long-Horizon Adaptive Memory Engine
- Tagline: An open-source agentic memory engine for long-horizon adaptive learning.
License
MIT. Compatible with Context-Engineering's MIT license and all upstream components.
See ATTRIBUTION.md and conductor/product.md for the full attribution and product specifications.
File details
Details for the file elume-0.2.0.tar.gz.

File metadata
- Download URL: elume-0.2.0.tar.gz
- Size: 324.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 45e4c83e9d25c89ededb80922402252f7ed92bcb47a819f8c2ba1561b2632060 |
| MD5 | 7a33b08ac4be8822f09d3fe3515b4678 |
| BLAKE2b-256 | 23176f24f51c0c4231d6e0fd5b360c39bb4e81295668673101fe228db28d4fd8 |
Provenance
The following attestation bundle was made for elume-0.2.0.tar.gz:

Publisher: publish.yml on bionicbutterfly13/elume
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: elume-0.2.0.tar.gz
- Subject digest: 45e4c83e9d25c89ededb80922402252f7ed92bcb47a819f8c2ba1561b2632060
- Sigstore transparency entry: 1440288709
- Permalink: bionicbutterfly13/elume@09d0f59465d91bc641e7f870afba4c3a70b93596
- Branch / Tag: refs/tags/v0.2.0
- Owner: https://github.com/bionicbutterfly13
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@09d0f59465d91bc641e7f870afba4c3a70b93596
- Trigger Event: release
File details
Details for the file elume-0.2.0-py3-none-any.whl.

File metadata
- Download URL: elume-0.2.0-py3-none-any.whl
- Size: 132.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1869f08a558e14267c89ef3c5a3d7edf5f5b7ffe4513451fb8557f78f7e77e00 |
| MD5 | a2491bd65ff1083040540d9c27592f99 |
| BLAKE2b-256 | c06b509a6c405cec676e4bb2994349c417754fe566007d0eee9098061b8b1062 |
Provenance
The following attestation bundle was made for elume-0.2.0-py3-none-any.whl:

Publisher: publish.yml on bionicbutterfly13/elume
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: elume-0.2.0-py3-none-any.whl
- Subject digest: 1869f08a558e14267c89ef3c5a3d7edf5f5b7ffe4513451fb8557f78f7e77e00
- Sigstore transparency entry: 1440288735
- Permalink: bionicbutterfly13/elume@09d0f59465d91bc641e7f870afba4c3a70b93596
- Branch / Tag: refs/tags/v0.2.0
- Owner: https://github.com/bionicbutterfly13
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@09d0f59465d91bc641e7f870afba4c3a70b93596
- Trigger Event: release