

Project description

Elume


An open-source agentic memory engine for long-horizon adaptive learning.

Elume brings together existing memory and sequence-modeling components into a single working system for long-horizon agents.

It integrates LinOSS-style long-horizon temporal encoding (Rusch & Rus, ICLR 2025), attractor-based associative memory, and a deterministic adaptive memory substrate into one open-source stack. The contribution of Elume is not the invention of these underlying methods in isolation, but the engineering work required to combine them, adapt their codepaths, and make them operate coherently in a unified memory system.

What Elume is

Elume is an integration layer and runtime memory stack for agents that need to:

  • encode long trajectories,
  • recover useful prior state through associative recall,
  • and adapt memory behavior over time.

In practice, Elume packages and stabilizes multiple upstream ideas and implementations so they can be used together as a single agent memory engine.

What Elume is not

Elume does not claim authorship of the original LinOSS, MemEvolve, or Hopfield-style memory ideas.

Instead, it is an open-source composition of these components, with the modifications, interfaces, and system-level fixes needed to make them work together in one usable framework.

Elume's evolution module is a deterministic, replay-safe genetic algorithm operating on immutable Strategy records through a provider boundary. The framing — agent memory as an evolvable population rather than policy weights — is adopted from MemEvolve (Zhang et al. 2025, arXiv:2512.18746). The implementation is original Elume work using standard GA primitives. What Elume contributes is the engineering substrate: byte-identical replay, immutable lineage, provider abstraction, and composable operator protocols.
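The successor discipline described above can be sketched with standard-library pieces. This is a minimal illustration, not Elume's actual `Strategy` record: the field names, the Gaussian mutation, and the `sigma` parameter are all assumptions; only the pattern (frozen record, injected RNG, `.evolved()` returning a new record) comes from the text.

```python
from dataclasses import dataclass, replace
import random

@dataclass(frozen=True)
class Strategy:
    """Illustrative immutable strategy record; fields are hypothetical."""
    genome: tuple[float, ...]
    generation: int

    def evolved(self, rng: random.Random, sigma: float = 0.1) -> "Strategy":
        # Produce a successor record instead of mutating in place.
        child = tuple(g + rng.gauss(0.0, sigma) for g in self.genome)
        return replace(self, genome=child, generation=self.generation + 1)

# Injected RNG makes the step replayable: same seed, same successor bytes.
parent = Strategy(genome=(0.5, -0.2), generation=0)
a = parent.evolved(random.Random(42))
b = parent.evolved(random.Random(42))
```

Because the record is frozen, `parent` is untouched after the step; lineage is a chain of successors rather than a history of mutations.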

What Elume created

Things that did not exist anywhere before this project:

  • The deterministic envelope (elume.envelope, v0.1) — a canonical pre-image (BLAKE2b-256) over operation inputs, RNG state, result, and provider snapshot, giving every cognitive op a byte-identical replay contract. Five reference operations registered today (belief embed, basin recall, thought competition, evolution step, self-model step).
  • The platform-tagged float-hash policy — platform_fingerprint() is folded into the canonical pre-image so cross-platform replay drift surfaces as a hash mismatch by construction, not as silent agreement on incidentally matching bytes.
  • elume.adapters.memevolve.ElumeMemoryProvider (v0.2) — the first deterministic baseline in MemEvolve's --memory_provider list. Same seed, same input → byte-identical MemoryResponse per step.
  • cognition.curiosity_score as a replayable envelope op (v0.2) — Shannon-entropy + information-gain scoring wrapped with the same hash-equal replay contract as every other op. The math is ported from dionysus3 (credited below); the envelope wrap, the integration with run_gated_thought_competition, and the CuriosityPrior derivation are original Elume work.
  • Hyperevolution coupling (v0.2) — the wiring inside ElumeMemoryProvider that lets curiosity continuously re-acquire the search heading: provide_memory re-ranks basins by current information gain; take_in_memory updates a per-session BeliefBuffer from trajectory outcomes; the whole pattern toggles via one config key.
  • The kernel discipline — frozen records, successor semantics (.evolved(), .revised(), .with_status()), injected RNG, provider-boundary persistence, no framework dependencies. This discipline applied uniformly across LinOSS, basins, evolution, cognition, embedders, and providers is what makes the whole stack composable inside one Python package.
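The envelope idea reduces to a canonical pre-image fed to BLAKE2b-256. The sketch below shows the shape under stated assumptions: `platform_tag()` is a stand-in for `platform_fingerprint()`, the field names and the JSON canonicalization are illustrative, and `rng_seed` stands in for full RNG state.

```python
import hashlib
import json
import platform
import struct

def platform_tag() -> str:
    # Stand-in for platform_fingerprint(): tag the float environment so that
    # cross-platform drift surfaces as a hash mismatch rather than silently.
    return f"{platform.machine()}/{struct.calcsize('P') * 8}bit"

def envelope_hash(op: str, inputs: dict, rng_seed: int,
                  result: dict, snapshot: dict) -> str:
    # Canonical pre-image: sorted-key, no-whitespace JSON over everything
    # that determines the operation, hashed with BLAKE2b-256.
    pre_image = json.dumps(
        {"op": op, "platform": platform_tag(), "inputs": inputs,
         "rng_seed": rng_seed, "result": result, "snapshot": snapshot},
        sort_keys=True, separators=(",", ":"),
    ).encode("utf-8")
    return hashlib.blake2b(pre_image, digest_size=32).hexdigest()

# Replaying the op with identical inputs must reproduce the hash exactly.
h1 = envelope_hash("basin_recall", {"probe": [0.1, 0.9]}, 7, {"basin": 2}, {"rev": 3})
h2 = envelope_hash("basin_recall", {"probe": [0.1, 0.9]}, 7, {"basin": 2}, {"rev": 3})
```

Folding the platform tag into the pre-image is what turns cross-platform float drift into a visible mismatch: two machines that disagree on bytes can never agree on the hash.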

What Elume adopted from upstream is named in the Attribution section. What Elume created is everything in the bullets above.

Core composition

Elume combines:

  1. LinOSS-based temporal encoding for long-horizon trajectory representation.
  2. Attractor-based associative memory for content-addressable recall.
  3. Deterministic adaptive memory logic for improving memory behavior over time, with an optional curiosity homing signal.

These components are integrated into a shared memory pipeline for agentic learning.
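For the associative-memory leg, the classical Hopfield mechanics cited in the Attribution section look like this. This is the textbook Hebbian rule and sign-update recall, not Elume's basin-field code; shapes and step counts are illustrative.

```python
import numpy as np

def hopfield_store(patterns: np.ndarray) -> np.ndarray:
    # Hebbian outer-product rule over rows of +/-1 patterns; zero diagonal.
    W = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W: np.ndarray, probe: np.ndarray, steps: int = 10) -> np.ndarray:
    # Synchronous sign updates descend into the nearest stored attractor.
    state = probe.copy()
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0
        if np.array_equal(new, state):
            break
        state = new
    return state

# Content-addressable recall: a corrupted probe falls back into its basin.
p = np.array([1.0, -1.0, 1.0, -1.0, 1.0, -1.0, 1.0, -1.0])
W = hopfield_store(p[None, :])
probe = p.copy()
probe[0] = -probe[0]          # flip one bit
recovered = hopfield_recall(W, probe)
```

The Amit–Gutfreund–Sompolinsky capacity bound mentioned later (~0.14 patterns per neuron) is what limits how many such basins a single weight matrix can hold reliably.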

Why Elume

  • Determinism — injected RNG, byte-identical replay within a platform fingerprint. Every retrieval decision can be audited.
  • Immutable records — frozen trajectory snapshots, belief states, and basin activations. Strategies evolve via successors, not mutation.
  • Provider boundary — storage is a protocol contract, not an implementation. Swap backends without touching cognition code.
  • No framework lock-in — no FastAPI, Graphiti, or agent runtime in the core. Adapters live in consumers.
  • Cross-platform float-hash policy — platform_fingerprint() is folded into the canonical hash pre-image, so cross-platform drift is a visible mismatch, not silent corruption.
  • Curiosity-driven hyperevolution — the optional curiosity homing signal biases memory retrieval toward entropy-reducing directions, turning uniform-random search into goal-directed exploration.
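The curiosity signal is described as Shannon entropy plus information gain. A minimal reading of that combination, with every detail beyond the two named quantities assumed (the ported dionysus3 math may weight or normalize differently):

```python
import math

def shannon_entropy(probs: list[float]) -> float:
    # H(p) in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def information_gain(prior: list[float], posterior: list[float]) -> float:
    # Entropy reduction from prior belief to posterior belief.
    return shannon_entropy(prior) - shannon_entropy(posterior)

def curiosity_score(prior: list[float], posterior: list[float]) -> float:
    # Positive exactly when an observation sharpened the belief distribution;
    # clamping at zero is an illustrative choice, not Elume's.
    return max(0.0, information_gain(prior, posterior))

# A uniform prior over four basins sharpened to near-certainty scores high.
g = curiosity_score([0.25] * 4, [0.97, 0.01, 0.01, 0.01])
```

Ranking candidate basins by such a score is what turns uniform-random search into the entropy-reducing, goal-directed exploration described above.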

MemEvolve cartridge

Elume v0.2.0 ships a BaseMemoryProvider-conformant adapter so MemEvolve (bingreeky/MemEvolve) can benchmark Elume against its 11 existing baselines. Two-line registration, then:

python run_flash_searcher_mm_gaia.py --memory_provider elume --sample_num 5

See docs/adapters/memevolve.md for the full install guide, determinism guarantee, and hyperevolution mode.
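The cartridge contract can be pictured as a two-method surface using the `provide_memory` / `take_in_memory` names mentioned above. Everything else here is a toy: the real `BaseMemoryProvider` lives in bingreeky/MemEvolve and differs in signatures and payloads; the overlap ranking is purely illustrative of the determinism guarantee.

```python
from typing import Any, Protocol

class BaseMemoryProviderLike(Protocol):
    """Hypothetical sketch of the cartridge surface, not the upstream class."""
    def provide_memory(self, query: str) -> dict[str, Any]: ...
    def take_in_memory(self, trajectory: list[dict[str, Any]]) -> None: ...

class ToyDeterministicProvider:
    def __init__(self) -> None:
        self._store: list[str] = []

    def take_in_memory(self, trajectory: list[dict[str, Any]]) -> None:
        # Fold trajectory observations into the store in arrival order.
        self._store.extend(str(step.get("obs", "")) for step in trajectory)

    def provide_memory(self, query: str) -> dict[str, Any]:
        # Purely input-determined ranking: token overlap with the query,
        # ties broken lexicographically, so replay is byte-identical.
        def overlap(s: str) -> int:
            return len(set(s.split()) & set(query.split()))
        ranked = sorted(self._store, key=lambda s: (-overlap(s), s))
        return {"query": query, "memories": ranked[:3]}
```

Two providers fed the same trajectory return the same `MemoryResponse`-shaped dict for the same query, which is the "same seed, same input, byte-identical per step" property the adapter advertises.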

Why Elume exists

Many memory systems are strong in isolation but difficult to combine in practice.

Elume exists to make these components interoperable: to unify their interfaces, reconcile assumptions, patch incompatibilities, and provide a coherent open-source implementation that others can inspect, use, and build on.

Attribution

Elume builds directly on upstream work and code associated with LinOSS, MemEvolve, Hopfield-style associative memory, and attractor / neural-field context-engineering ideas.

Specific upstream sources:

  • LinOSS — Oscillatory State-Space Models — T. Konstantin Rusch and Daniela Rus, International Conference on Learning Representations (ICLR), 2025. Temporal encoding substrate and oscillator dynamics inside the basin field.
  • MemEvolve — Meta-Evolution of Agent Memory Systems — Guibin Zhang, Haotian Ren, Chong Zhan, Zhenhong Zhou, Junhao Wang, He Zhu, Wangchunshu Zhou, and Shuicheng Yan, arXiv preprint 2512.18746, 2025. Source of the evolvable-memory-population framing. The BaseMemoryProvider cartridge interface and shaping helpers in src/elume/adapters/memevolve/shaping.py are adapted from the bingreeky/MemEvolve codebase (Apache-2.0), with HTTP/HMAC stripped.
  • Context Engineering: Beyond Prompt Engineering — Context Engineering Contributors (maintained by David Kimai), github.com/davidkimai/context-engineering (MIT), 2025. Source of the attractor-based neural-field model at the core of Elume's memory layer — specifically 00_foundations/08_neural_fields_foundations.md, 00_foundations/11_emergence_and_attractor_dynamics.md, 40_reference/attractor_dynamics.md, and the memory-attractor protocol shells in 60_protocols/shells/.
  • Hopfield-style associative memory — Hopfield (PNAS 1982); textbook synthesis from Anderson (2014, Ch. 13); capacity bound from Amit, Gutfreund & Sompolinsky (1985). Classical mathematical substrate for discrete pattern storage inside the basin subsystem.
  • Source codebase — dionysus3, a research cognitive architecture. Every module in elume/ was originally developed there. Elume relocates the kernel math with verbatim semantics and strips project-specific glue so the result is a pure library. The Shannon-entropy + information-gain mechanism in src/elume/cognition/curiosity.py is ported from dionysus3's CuriosityDriveService (api/services/mosaeic_self_discovery.py and arousal_system_service.py).

BibTeX entries for all upstream academic citations are in CITATIONS.bib. Please cite the upstream sources in any published work that uses Elume.

Status

Elume is an open-source integration project under active development.

Twenty-five tracks landed: kernel bootstrap, core data models, LinOSS solver + timing, Hopfield network, basin field engine, attractor basin core, embedder protocol, provider contracts, the evolution engine, the self-modeling network engine, immutable cognitive record types, immutable mental-model domain records, immutable metacognitive control records, prior hierarchy records, mental-model subnetworks, the cognitive event protocol, cognitive-event embedders, immutable thought-level records, immutable neuronal-packet records, deterministic thought competition, prior-gated cognition, the MemEvolve cartridge, curiosity homing device, and hyperevolution wiring. Track 007 was retired after source review showed it was framed against the wrong dionysus3 concept. 1177 tests passing, ruff clean.

Phase 2 is complete through the prior gate: Track 011 shipped elume.network; Tracks 014, 016, 018, 021, and 022 landed the minimal cognition gate from MentalModel through LinOSSEncoder; Tracks 012, 013, and 019 landed immutable thought and packet records plus deterministic EFE competition; and Tracks 015, 017, and 020 landed metacognitive control, generic priors, and prior-gated cognition. See conductor/tracks.md.

Phase 3 is complete: the MemEvolve cartridge (elume.adapters.memevolve), curiosity homing (elume.cognition.curiosity), and hyperevolution wiring now connect Elume's deterministic substrate to MemEvolve's outer evolutionary loop.

Archon-style deterministic-harness adoption is complete for v0.1.0. The kernel has injected RNGs, frozen trajectory metadata, provider snapshots, and an elume.envelope v0 operation registry covering belief embedding, evolution step, thought competition, self-model stepping, Hopfield recall, and (v0.2.0) curiosity scoring. Cross-platform float-hash policy is documented in docs/archon-readiness/21-float-hash-policy.md.

Install

Requires Python >=3.11.

pip install elume

Quickstart (development)

For local development, use uv and an editable install:

# from the repo root
uv venv .venv
uv pip install -e ".[dev]"

# run the test suite
.venv/bin/pytest

# lint
.venv/bin/ruff check src tests reference_service/src

# optional: reference service demo
uv pip install -e ./reference_service
PYTHONPATH=src:reference_service/src python -m reference_service

Layout

elume/
├── src/
│   └── elume/
│       ├── basins/      # Hopfield + basin field dynamics (neural fields model)
│       ├── cognition/   # mental-model subnetworks + typed cognitive events
│       ├── embedders/   # event -> trajectory projection protocols
│       ├── linoss/      # oscillatory state-space primitives (solver, timing, encoder)
│       ├── network/     # self-modeling network substrate for Phase 2 cognition
│       ├── evolution/   # successor-based strategy evolution
│       ├── providers/   # storage contracts + reference provider
│       ├── envelope/    # deterministic replay envelope + reference ops
│       ├── adapters/    # provider adapters (memevolve cartridge)
│       └── models/      # beliefs, strategies, trajectories, cognitive + thought records
├── reference_service/   # runnable CLI/FastAPI demo (separate package, optional)
├── tests/
│   ├── unit/            # unit tests for kernel modules
│   ├── contract/        # contract tests consumers re-run against their impls
│   └── integration/     # end-to-end composition tests across subsystems
└── conductor/           # spec-driven development docs and tracks

Consuming Elume

Downstream projects pin a versioned PyPI release:

pip install elume==0.1.0

For co-development against an unreleased branch, an editable install also works:

# from the consumer repo (e.g. dionysus3)
pip install -e /path/to/elume

Principles

  • Integration, not invention. The underlying techniques are open source or openly published; Elume's work is bringing them together.
  • Kernel, not application. Reusable mechanism only. Adapters and policies live in consumers.
  • No framework lock-in. No FastAPI, no Graphiti, no agent runtime in the core.
  • Pluggable storage. Providers are contracts, not implementations.
  • Reproducible. Deterministic where possible; evolution randomness goes through an injectable RNG.
  • Contract tests as the regression net. Consumers re-run tests/contract/ against their provider implementations.
  • The past is frozen. Trajectory records, belief snapshots, and basin activations are immutable. Strategies evolve by producing successors, not by mutating in place.

On the name

Elume is the brand form. ELUME works as an acronym mnemonic — Evolving, Long-horizon, Unified, Memory, Engine.

For public descriptors:

  • Short: Agentic Memory Engine
  • Technical long form: Long-Horizon Adaptive Memory Engine
  • Tagline: An open-source agentic memory engine for long-horizon adaptive learning.

License

MIT. Compatible with Context-Engineering's MIT license and all upstream components.

See ATTRIBUTION.md and conductor/product.md for the full attribution and product specifications.
