mem0-dbay

PostgreSQL + pgvector graph store backend for Mem0, powered by dbay.cloud.
Drop Neo4j. Use one database for everything.
mem0-dbay is a graph store plugin for Mem0 that replaces Neo4j with dbay.cloud, a serverless PostgreSQL platform with pgvector built in. Your vector store and graph store run in one database, with zero infrastructure to manage.
Why?
Self-hosting Mem0 requires three separate databases:

```
Mem0 → PostgreSQL + pgvector  (vector store)
     → Neo4j                  (graph store)  ← extra infra
     → SQLite                 (history)
```
Neo4j adds real pain:
- Another database to deploy, monitor, backup, and pay for
- 90-second cold start in Docker
- Completely different query language (Cypher)
- APOC plugin dependency
With mem0-dbay, everything runs on one dbay.cloud database:
```
Mem0 → dbay.cloud PostgreSQL  (vectors + graph + history)
```
One connection string. One database. Zero Cypher. Zero ops.
Quick start
1. Create a free database on dbay.cloud
Sign up at dbay.cloud and create a database. You'll get a connection string like:
```
postgres://user_xxx:password@pg.dbay.cloud:4432/my-mem0-db?sslmode=require&options=endpoint%3Dmy-mem0-db
```
pgvector is pre-installed. No extensions to enable, no tables to create — mem0-dbay handles everything automatically.
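The connection string is a standard PostgreSQL URL, so mem0-dbay takes it as-is. If you ever need to pull its parts out for other tooling, Python's standard library is enough (the credentials below are the placeholder values from the example above):

```python
from urllib.parse import urlsplit, parse_qs

# Placeholder connection string from the example above
url = ("postgres://user_xxx:password@pg.dbay.cloud:4432/my-mem0-db"
       "?sslmode=require&options=endpoint%3Dmy-mem0-db")

parts = urlsplit(url)
query = parse_qs(parts.query)

print(parts.hostname)          # pg.dbay.cloud
print(parts.port)              # 4432
print(parts.path.lstrip("/"))  # my-mem0-db (database name)
print(query["sslmode"][0])     # require
print(query["options"][0])     # endpoint=my-mem0-db (%3D is a URL-encoded "=")
```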
2. Install
```
pip install mem0-dbay
```
3. Use
```python
import mem0_dbay

DBAY_URL = "postgres://user_xxx:password@pg.dbay.cloud:4432/my-mem0-db?sslmode=require&options=endpoint%3Dmy-mem0-db"

m = mem0_dbay.create_memory({
    "graph_store": {
        "provider": "dbay",
        "config": {"connection_string": DBAY_URL, "embedding_dimension": 1536},
    },
    "vector_store": {
        "provider": "pgvector",
        "config": {"connection_string": DBAY_URL, "embedding_model_dims": 1536},
    },
    "llm": {
        "provider": "openai",
        "config": {"api_key": "sk-..."},
    },
    "embedder": {
        "provider": "openai",
        "config": {"api_key": "sk-..."},
    },
})

# Add memories — entities and relationships are auto-extracted
m.add("Alice works at Google as an engineer", user_id="alice")
m.add("Bob manages Alice at Google", user_id="alice")

# Search
results = m.search("Where does Alice work?", user_id="alice")
```
That's it. No Neo4j to install. No Docker Compose with 3 containers. Just pip install and a connection string.
Works with any OpenAI-compatible LLM
Not just OpenAI — use DeepSeek, SiliconFlow, or any OpenAI-compatible provider:
```python
m = mem0_dbay.create_memory({
    "graph_store": {
        "provider": "dbay",
        "config": {"connection_string": DBAY_URL, "embedding_dimension": 1024},
    },
    "vector_store": {
        "provider": "pgvector",
        "config": {"connection_string": DBAY_URL, "embedding_model_dims": 1024},
    },
    "llm": {
        "provider": "openai",
        "config": {
            "api_key": "sk-...",
            "openai_base_url": "https://api.siliconflow.cn/v1",
            "model": "deepseek-ai/DeepSeek-V3",
        },
    },
    "embedder": {
        "provider": "openai",
        "config": {
            "api_key": "sk-...",
            "openai_base_url": "https://api.siliconflow.cn/v1",
            "model": "BAAI/bge-m3",
        },
    },
})
```
Why dbay.cloud?
dbay.cloud is a Serverless PostgreSQL platform purpose-built for AI workloads.
| | Self-hosted PG + Neo4j | dbay.cloud |
|---|---|---|
| Databases to manage | 2 (PG + Neo4j) | 0 — fully managed |
| Idle cost | Running 24/7 (~$50+/mo) | Scale-to-zero — free when idle |
| Cold start | Neo4j: 90s | ~500ms warm / ~8s cold |
| Compute elasticity | Fixed size | 1cu–8cu on demand — scales with your workload |
| pgvector | Install yourself | Pre-installed, HNSW indexes ready |
| BM25 full-text search | Not available | pg_search (ParadeDB) pre-installed |
| Database branching | Not possible | Git-style copy-on-write branches |
| Setup time | Hours (Docker, configs, health checks) | 30 seconds |
Scale-to-zero: pay nothing when idle
Most AI memory databases sit idle 90%+ of the time. On dbay.cloud, idle databases automatically suspend and cost nothing. When your agent makes a query, the database wakes up in ~500ms. Perfect for per-user memory databases where most users aren't active simultaneously.
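Since a fully suspended database takes a few seconds to resume, the very first query after a long idle period can exceed a short client timeout. A generic retry wrapper is enough to absorb that; this sketch is not part of mem0-dbay, and the exception type to catch depends on your driver:

```python
import time

def with_wakeup_retry(fn, attempts=3, base_delay=2.0, retry_on=(ConnectionError,)):
    """Call fn(), retrying with linear backoff while the database wakes up."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise  # still failing after the database should be awake
            time.sleep(base_delay * (attempt + 1))

# Hypothetical usage: wrap the first search after an idle period
# results = with_wakeup_retry(lambda: m.search("Where does Alice work?", user_id="alice"))
```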
Git-style branching: A/B test your memory strategies
Want to test a new reflection strategy without risking your production memories?
```
main (production memories)
├── branch: experiment-a   ← test new extraction prompts
└── branch: experiment-b   ← test different similarity thresholds
```
Branches are instant copy-on-write — zero storage overhead. Compare results, keep the winner, delete the rest. No other memory infrastructure can do this.
Elastic compute: scale up for batch processing
Running a batch reflection job on thousands of memories? Scale up to 8cu (8 vCPU, 16 GB) for the job, then scale back to 1cu. No server resizing, no downtime.
How it works
Mem0's graph memory only does simple operations — no PageRank, no graph algorithms:
| What Mem0 does | Neo4j (Cypher) | mem0-dbay (SQL) |
|---|---|---|
| Store entities | `:__Entity__` nodes | `graph_nodes` table |
| Vector similarity | `vector.similarity.cosine()` | pgvector `<=>` operator |
| Store relationships | `MERGE (a)-[r:TYPE]->(b)` | `INSERT INTO graph_edges` |
| Find neighbors | `MATCH (n)-[r]->(m)` | `JOIN graph_edges ON ...` |
Two PostgreSQL tables replace an entire Neo4j instance.
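The plugin's actual schema is internal, but the idea behind it can be sketched in plain Python: nodes carry embeddings, edges are (source, relation, target) rows, vector similarity resolves an entity, and "find neighbors" is just a join. All names below are illustrative, not mem0-dbay's real tables:

```python
import math

# In-memory stand-ins for the two tables (illustrative, not the real schema)
graph_nodes = [
    {"id": 1, "name": "alice",  "embedding": [1.0, 0.0]},
    {"id": 2, "name": "google", "embedding": [0.0, 1.0]},
    {"id": 3, "name": "bob",    "embedding": [0.9, 0.1]},
]
graph_edges = [
    {"source": 1, "relation": "works_at", "target": 2},
    {"source": 3, "relation": "manages",  "target": 1},
]

def cosine_distance(a, b):
    """What pgvector's <=> operator computes: 1 - cosine similarity."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / norm

def nearest_node(query_embedding):
    """Like: SELECT * FROM graph_nodes ORDER BY embedding <=> query LIMIT 1"""
    return min(graph_nodes, key=lambda n: cosine_distance(n["embedding"], query_embedding))

def neighbors(node_id):
    """Like: SELECT relation, name FROM graph_edges JOIN graph_nodes ON target = id
       WHERE source = node_id"""
    by_id = {n["id"]: n for n in graph_nodes}
    return [(e["relation"], by_id[e["target"]]["name"])
            for e in graph_edges if e["source"] == node_id]

node = nearest_node([0.95, 0.05])       # resolves the entity by vector similarity
print(node["name"], neighbors(node["id"]))
```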
Also works with local PostgreSQL
If you prefer self-hosting, mem0-dbay works with any PostgreSQL that has pgvector:
```python
"connection_string": "postgresql://user:pass@localhost:5432/mydb"
```
License
MIT