Event Sourcing based belief formation system for long-term AI agent memory
Project description
Bubble
How it works
[ raw input ]
│
┌─────▼─────┐
│ decompose │
└─────┬─────┘
│
┌────────────────┴────────────────┐
ι ≥ θ ι < θ
│ │
vivid signal weak signal
│ │
┌────▼────┐ ┌──────▼──────┐
│ archive │ │ pool │
└────┬────┘ │ · · · · · │
│ │ · · · · · │
│ │ · · · · · │
│ └──────┬──────┘
│ │
│ enough gathered?
│ │
│ no ──────────┘
│ │ yes
│ ┌──────▼──────┐
│ │ cluster │
│ │ + score │
│ └──────┬──────┘
│ │
└──────────────┬──────────────────┘
│
┌─────▼─────┐
│ episode │ immutable
└─────┬─────┘
│
same topic chain?(NLI)
yes │ │ no
│ │
┌──────────▼─┐ ┌▼────────────┐
joins chain │ ... ──► e │ │ e │ new chain
└──────────┬─┘ └─────┬───────┘
│ │
┌─────▼───────────▼─────┐
│ snapshot │
│ centroid │ summary │
│ (eager) │ (lazy) │
└───────────┬───────────┘
│
[ retrieve ]
│
┌────────────────┴─────────────────┐
default verbose
│ │
snapshot summary with episode chain + labels
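The flow above routes each decomposed signal by its intensity ι against a threshold θ: vivid signals (ι ≥ θ) are archived directly, while weak signals accumulate in a pool until enough have gathered to cluster and score. A minimal sketch of that routing logic (names, threshold, and pool size are hypothetical, not Bubble's actual code):

```python
from dataclasses import dataclass, field

@dataclass
class Router:
    """Route signals by intensity: vivid -> archive, weak -> pool."""
    theta: float = 0.7        # intensity threshold (illustrative default)
    pool_size: int = 5        # weak signals needed before clustering
    archive: list = field(default_factory=list)
    pool: list = field(default_factory=list)

    def process(self, signal: str, intensity: float) -> str:
        if intensity >= self.theta:          # vivid signal: archive immediately
            self.archive.append(signal)
            return "archived"
        self.pool.append(signal)             # weak signal: wait in the pool
        if len(self.pool) >= self.pool_size: # enough gathered?
            batch, self.pool = self.pool, []
            self.archive.append(batch)       # cluster + score would happen here
            return "clustered"
        return "pooled"
```

Either path ends in an immutable episode, which is what makes the later replay step deterministic.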
Setup
1. Run FalkorDB

```shell
docker run -e REDIS_ARGS="--appendonly yes --appendfsync everysec" -v <PATH>:/var/lib/falkordb/data -p 3000:3000 -p 6379:6379 -d --name falkordb falkordb/falkordb
```
2. Embedding model (Matryoshka)

Note: the command below is the CPU version.

```shell
docker run --name tei-embedding -d -p 8997:80 -v <PATH>:/data --pull always ghcr.io/huggingface/text-embeddings-inference:cpu-latest --model-id nomic-ai/nomic-embed-text-v1.5
```

Alternatively, use a cloud embedding API.
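Matryoshka embeddings are trained so that a prefix of the full vector is itself a usable lower-dimensional embedding; shrinking BUBBLE_EMBED_DIM amounts to truncating and L2-renormalizing. An illustration of the idea (not tied to Bubble's internals):

```python
import math

def truncate_matryoshka(vec: list[float], dim: int) -> list[float]:
    """Keep the first `dim` components and L2-renormalize the prefix."""
    prefix = vec[:dim]
    norm = math.sqrt(sum(x * x for x in prefix))
    return [x / norm for x in prefix] if norm else prefix
```

Because similarity search uses cosine distance on unit vectors, the renormalization step keeps truncated embeddings comparable to each other.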
3. NLI model (optional, but recommended; it saves some LLM calls)

Note: the command below is the CPU version.

```shell
docker run --name tei-nli -d -p 8999:80 -v <PATH>:/data --pull always ghcr.io/huggingface/text-embeddings-inference:cpu-latest --model-id cross-encoder/nli-deberta-v3-small
```
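The NLI model backs the "same topic chain?" branch in the diagram: a cross-encoder scores whether a new episode follows from the chain's current topic, so an LLM call can be skipped. A hedged sketch of how such scores might gate the join/new-chain decision (the threshold and rule are illustrative, not Bubble's actual policy):

```python
def joins_chain(scores: dict[str, float], threshold: float = 0.5) -> bool:
    """Decide chain membership from NLI label probabilities.

    `scores` maps labels ('entailment', 'neutral', 'contradiction') to
    probabilities, as a cross-encoder NLI model would return them.
    """
    label = max(scores, key=scores.get)      # highest-probability label
    return label == "entailment" and scores[label] >= threshold
```

If the check fails, the episode starts a new chain instead of extending the existing one.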
Necessary configuration in your .env file:

```
ANTHROPIC_API_KEY=
FALKORDB_HOST=localhost
FALKORDB_PORT=6379
BUBBLE_EMBED_DIM=768
BUBBLE_EMBED_ENDPOINT=http://localhost:8997/v1/embeddings

# If you have NLI set up
BUBBLE_ENABLE_NLI=true
BUBBLE_NLI_ENDPOINT=http://localhost:8999/predict
```
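For reference, the variables above might be read along these lines (a sketch with assumed defaults; the package's actual config loader may differ):

```python
import os

def load_bubble_config() -> dict:
    """Read Bubble-related settings from the environment (illustrative)."""
    return {
        "falkordb_host": os.environ.get("FALKORDB_HOST", "localhost"),
        "falkordb_port": int(os.environ.get("FALKORDB_PORT", "6379")),
        "embed_dim": int(os.environ.get("BUBBLE_EMBED_DIM", "768")),
        "embed_endpoint": os.environ.get("BUBBLE_EMBED_ENDPOINT", ""),
        # boolean flags in .env files are strings, so compare explicitly
        "enable_nli": os.environ.get("BUBBLE_ENABLE_NLI", "false").lower() == "true",
        "nli_endpoint": os.environ.get("BUBBLE_NLI_ENDPOINT", ""),
    }
```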
How to use (extremely easy and clean)
Ingest:

```python
import bubble

bubble.process(user_id, content, prior)
```

`prior` is the context of the content, for example prior messages.

Retrieve:

```python
import bubble

memory_user = await bubble.retrieve(user_id, query)
```
Replayability
Memory episodes are archived in <project root>/data/archive as JSONL. You can reconstruct your whole memory graph with a single command, without a single LLM call:

```shell
python -m bubble.main replay <user_id>
```
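Replay works because the archive is an event log: re-reading the JSONL events in order and folding them back into state is deterministic, which is why no LLM is needed. A toy illustration of the event-sourcing fold (the event schema here is invented, not Bubble's):

```python
import json

def replay(lines: list[str]) -> dict:
    """Rebuild state by folding archived JSONL events in order (toy schema)."""
    state: dict = {"episodes": [], "chains": {}}
    for line in lines:
        event = json.loads(line)
        if event["type"] == "episode":
            state["episodes"].append(event["id"])
        elif event["type"] == "chain_link":
            state["chains"].setdefault(event["chain"], []).append(event["id"])
    return state
```

Running the same log through the fold always yields the same graph, so the archive alone is the source of truth.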
Tuning/Customization
See .env.example for ALL tunable arguments.
Limitations
Bubble is currently an experimental project for personal use.
The current promotion formula and default env values may not be optimal.
The prompts likely have room for improvement; patch bubble.decomposer._SYSTEM if it doesn't fit your use case.
Leave a star if you like this work. Contributions are welcome.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file bubble_memory-0.1.0.tar.gz.
File metadata
- Download URL: bubble_memory-0.1.0.tar.gz
- Upload date:
- Size: 22.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3b51a32889584e94188b60aa1965c2ebc4e3dccb3c4a6d4ba1e55d802cf6a4d1 |
| MD5 | 2e2bf0ee59a7f85e046b2c08669d990e |
| BLAKE2b-256 | ee263aa9195b0d336b4ac920a73ae038b46994f37e8072a3300369e2405a02a5 |
File details
Details for the file bubble_memory-0.1.0-py3-none-any.whl.
File metadata
- Download URL: bubble_memory-0.1.0-py3-none-any.whl
- Upload date:
- Size: 25.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a27e2149231fd6af306aa744f5fefe4f7af32e24e174d139b38f5944633b02f6 |
| MD5 | e1cb0d1493c02835be88bf35bf9a7018 |
| BLAKE2b-256 | e81b43efec3ac4ce3e51e74d6898f3440d9859ba394c48125ab95ea53cdfdc8f |