An event-sourcing-based belief formation system for long-term AI agent memory

Bubble

A memory system that lets your chatbot truly know you, and maybe itself.

Core Idea

  1. Intensity gating. Not everything is equally worth remembering: the more trivial a segment, the more accumulation it takes to form a memory entry.
  2. Input decomposition. Input is decomposed into atomic segments that preserve their original meaning; only highly related segments cluster into episodes, which enables high-precision retrieval.
  3. Replayability. Human memory behaves like an event-sourcing system: replaying the same experiences yields a nearly identical personality.
  4. Lightweight, with minimal LLM calls.
    paper
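The intensity-gating idea can be sketched in a few lines. This is an illustrative toy, not Bubble's actual code: `THETA`, `POOL_MIN`, and the `ingest` helper are hypothetical names, and real clustering is far more than dumping the pool into one episode.

```python
# Toy sketch of intensity gating (illustrative only, not Bubble's real code).
# Segments at or above the threshold become episodes immediately ("vivid");
# weaker ones accumulate in a pool until enough have gathered to cluster.

THETA = 0.7     # promotion threshold (hypothetical value)
POOL_MIN = 5    # pool size needed before clustering (hypothetical value)

pool: list[str] = []
episodes: list[list[str]] = []

def ingest(segment: str, intensity: float) -> None:
    if intensity >= THETA:
        episodes.append([segment])        # vivid signal: direct episode
    else:
        pool.append(segment)              # weak signal: wait in the pool
        if len(pool) >= POOL_MIN:
            episodes.append(list(pool))   # stand-in for cluster + score
            pool.clear()

ingest("moved to Berlin", 0.9)   # vivid: becomes an episode immediately
ingest("had coffee", 0.2)        # trivial: sits in the pool
```

The point is the asymmetry: a vivid signal skips the pool entirely, while trivial ones must accumulate before they are allowed to form anything.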

How it works

[ raw input ]
                               │
                         ┌─────▼─────┐
                         │ decompose │
                         └─────┬─────┘
                               │
              ┌────────────────┴────────────────┐
          ι ≥ θ                             ι < θ
              │                                 │
        vivid signal                       weak signal
              │                                 │
         ┌────▼────┐                     ┌──────▼──────┐
         │ archive │                     │    pool     │
         └────┬────┘                     │  · · · · ·  │
              │                          │ ·  · ·  · · │
              │                          │  · · · ·  · │
              │                          └──────┬──────┘
              │                                 │
              │                          enough gathered?
              │                                 │
              │                    no ──────────┘
              │                                 │ yes
              │                          ┌──────▼──────┐
              │                          │   cluster   │
              │                          │   + score   │
              │                          └──────┬──────┘
              │                                 │
              └──────────────┬──────────────────┘
                             │
                       ┌─────▼─────┐
                       │  episode  │  immutable
                       └─────┬─────┘
                             │
                      same topic chain? (NLI)
                      yes │       │ no
                          │       │
               ┌──────────▼─┐   ┌▼────────────┐
   joins chain │ ... ──► e  │   │     e       │  new chain
               └──────────┬─┘   └─────┬───────┘
                          │           │
                    ┌─────▼───────────▼─────┐
                    │       snapshot        │
                    │  centroid  │  summary │ 
                    │  (eager)   │  (lazy)  │
                    └───────────┬───────────┘
                                │
                           [ retrieve ]
                                │
               ┌────────────────┴─────────────────┐
           default                            verbose
               │                                  │
         snapshot summary             with episode chain + labels
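The "same topic chain?" step above can be sketched as a routing function. This is a hedged illustration: `entails` here is a word-overlap stand-in for a real NLI call (e.g. to the cross-encoder endpoint configured below), and `ENTAIL_MIN` is a made-up threshold.

```python
# Illustrative sketch of episode-to-chain routing (not Bubble's actual code).
# A real deployment would replace `entails` with a call to the NLI endpoint.

ENTAIL_MIN = 0.5   # hypothetical entailment threshold

def entails(a: str, b: str) -> float:
    """Stand-in for an NLI score: word overlap between two texts."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa), 1)

def route(episode: str, chains: list[list[str]]) -> list[list[str]]:
    """Append the episode to the best-matching chain, or start a new one."""
    best, score = None, 0.0
    for chain in chains:
        s = entails(chain[-1], episode)
        if s > score:
            best, score = chain, s
    if best is not None and score >= ENTAIL_MIN:
        best.append(episode)       # same topic: joins the chain
    else:
        chains.append([episode])   # new topic: starts a new chain
    return chains
```

Comparing against the tail of each chain keeps the check cheap; whether Bubble compares against the tail, the centroid, or the summary is a detail the diagram leaves open.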

Setup

1. Run FalkorDB

docker run -e REDIS_ARGS="--appendonly yes --appendfsync everysec" -v <PATH>:/var/lib/falkordb/data -p 3000:3000 -p 6379:6379 -d --name falkordb falkordb/falkordb

2. Embedding model (Matryoshka)

Note: the command below runs the CPU image.

docker run --name tei-embedding -d -p 8997:80 -v <PATH>:/data --pull always ghcr.io/huggingface/text-embeddings-inference:cpu-latest --model-id nomic-ai/nomic-embed-text-v1.5

Alternatively, use a cloud embedding API.
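"Matryoshka" refers to embeddings (like nomic-embed-text-v1.5's) whose leading dimensions carry most of the signal, so a vector can be truncated and re-normalized instead of recomputed. A minimal sketch, with the 768 matching the BUBBLE_EMBED_DIM default below; the function name is illustrative, not part of Bubble's API:

```python
# Sketch of Matryoshka-style dimension reduction (illustrative only).
# Truncate a trained-for-truncation embedding, then re-normalize to unit length.
import math

def truncate_embedding(vec: list[float], dim: int) -> list[float]:
    head = vec[:dim]
    norm = math.sqrt(sum(x * x for x in head)) or 1.0
    return [x / norm for x in head]

# e.g. shrink a 768-d vector to 256 dims for cheaper storage/search
v = truncate_embedding([0.5] * 768, 256)
```

This only works well for models trained with Matryoshka representation learning; truncating an ordinary embedding this way degrades quality much faster.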

3. NLI model (optional but recommended; saves some LLM calls)

Note: the command below runs the CPU image.

docker run --name tei-nli -d -p 8999:80 -v <PATH>:/data --pull always ghcr.io/huggingface/text-embeddings-inference:cpu-latest --model-id cross-encoder/nli-deberta-v3-small

Required configuration in your .env file:

ANTHROPIC_API_KEY=
FALKORDB_HOST=localhost
FALKORDB_PORT=6379
BUBBLE_EMBED_DIM=768
BUBBLE_EMBED_ENDPOINT=http://localhost:8997/v1/embeddings

# If you have NLI set up
BUBBLE_ENABLE_NLI=true
BUBBLE_NLI_ENDPOINT=http://localhost:8999/predict

How to use (extremely easy and clean)

Installation

pip install bubble-memory
or
uv add bubble-memory

Ingest

import bubble
bubble.process(user_id, content, prior)

prior: context for the content, e.g. the preceding messages

Retrieve

import bubble
memory_user = await bubble.retrieve(user_id, query)

Replayability

Memory episodes are archived as JSONL in <project root>/data/archive.
You can reconstruct your whole memory graph with a single command, WITHOUT A SINGLE LLM CALL:

python -m bubble.main replay <user_id>
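The reason no LLM call is needed: the expensive steps (decomposition, scoring) are recorded as events, so replay is just re-applying them in order. A toy illustration of the principle, not Bubble's actual replay code (the event schema here is invented):

```python
# Toy event-sourcing replay (illustrative only; the JSONL schema is made up).
# Re-applying an append-only event log in order deterministically rebuilds
# the same state, with no model inference involved.
import io
import json

log = io.StringIO(
    '{"episode": "moved to Berlin", "chain": 0}\n'
    '{"episode": "found a flat in Kreuzberg", "chain": 0}\n'
)

chains: dict[int, list[str]] = {}
for line in log:
    event = json.loads(line)
    chains.setdefault(event["chain"], []).append(event["episode"])
```

This is also what "nearly identical personality would emerge again" means in practice: same log in, same graph out.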

Tuning/Customization

See .env.example for ALL tunable arguments.

Limitations

Bubble is currently an experimental project for personal use.
The current promotion formula and default tunable values may not be optimal.
The prompts also have room for improvement; patch bubble.decomposer._SYSTEM if they don't fit your use case.

Leave a star if you like this work. Contributions are welcome.

