Single-file graph memory for local AI, agents, and Python applications
liel
Structured memory layer for local AI agents — a single .liel file for entities and relationships you can traverse, not a generic graph database or "only" a Python graph library. Rust delivers safe, fast on-disk storage; Python (PyO3) is the main integration surface for demos, scripts, and tools.
For people who use local AI agents while coding — and who care about long-term memory, explicit relationships, and reuse across sessions.
Runs without API keys (LLM optional). Within about 30 seconds of `pip install liel`, you can run `liel-demo` or `python -m liel.demo`; the demo ships inside the wheel (no curl, no clone). Set `LIEL_DEMO_LLM=1` if you want a local Ollama model to phrase the exploration list. The source also lives in python/liel/demo.py; the repo wraps it as examples/08_demo.py.
This is not just storage — it changes what your agent remembers and suggests.
liel is best understood as a portable external brain for relationship-centric AI workflows, not a general-purpose or server-style graph platform. The same data can power a quick script, a bundled demo, or an MCP tool reading one .liel file.
Honest scope: it is a narrow, local durability model (see reliability) — not a distributed system and not a substitute for a full production database in every use case.
Entry points (what to open first): primary — liel-demo and liel[mcp] for tools like Claude; also — the Python API for application code, inspection, and automation. (Details in Quickstart.)
The core package has no runtime dependencies. No external database server, cloud service, or background daemon is required. On supported platforms, pip install liel is enough to get started.
MCP integration is optional. Install liel[mcp] only when you want to expose a .liel memory file to an MCP-capable AI tool.
The storage engine is a small Rust property graph core with optional MCP on top. If SQLite is the one-file relational store, liel is the one-file external-brain shape for local agents who think in relationships, not just rows or chunks. It is not a cluster graph server; it is a minimal substrate for agent memory you can ship in a file.
Etymology
Derived from the French lier (to connect / to bind).
The name reflects our core focus: transforming linear chat logs into a connected graph of knowledge.
The Zen of Liel
- One file, any place.
- No server, no waiting.
- Minimal dependencies, simple environments.
- Start small, stay local.
See Design principles.
Table of contents
- Quickstart
- Install
- Architecture
- What it is
- How this differs from RAG
- When liel fits
- When liel does not fit
- Design trade-offs
- Documentation
- Contributing
- License
Quickstart
Start here (primary first): run liel-demo (or python -m liel.demo) to see output in seconds; add liel[mcp] when you want the same file as an MCP memory layer. Use the Python API when you are wiring liel into code or one-off tools — sections below follow that order.
1. See what this actually remembers in ~10 seconds (LLM optional)
Example output from liel-demo (structured like agent memory, not a raw graph dump):
Running liel demo (no API key required)
=== Agent Memory Demo ===
[Input]
User: I like coffee in Silicon Valley. (simple example)
[Memory Stored]
-> Found preference: Coffee
-> Context: Silicon Valley
You --LIKES--> coffee
You --IN----> Silicon Valley
coffee --SUGGESTS--> espresso, single-origin beans, Palo Alto cafes
Silicon Valley --SUGGESTS--> Palo Alto cafes
[Graph Inputs]
preference: coffee
place: Silicon Valley
SUGGESTS overlap (preference & place): Palo Alto cafes
[Graph Traversal]
db.neighbors(coffee, edge_label='SUGGESTS')
-> espresso
-> single-origin beans
-> Palo Alto cafes
db.neighbors(place, edge_label='SUGGESTS')
-> Palo Alto cafes
ranked: shared suggestions first, then coffee-only suggestions
[Graph-based Suggestion]
-> Since you like coffee in Silicon Valley, you might explore: Palo Alto cafes, espresso, single-origin beans.
why: combines place + preference via SUGGESTS overlap, not a flat chat log
[Next Prompt Context]
-> User prefers coffee in Silicon Valley - bias future replies toward graph-linked topics: Palo Alto cafes, espresso, single-origin beans.
-> Exploration list: Palo Alto cafes, espresso, single-origin beans
[graph-derived from SUGGESTS (coffee and place, overlap first) - no LLM called]
-> (Paste into your agent prompt when you want this memory to steer the reply.)
Demo completed successfully
Fastest path (bundled in the package):
pip install liel
# option 1 — console script (requires install scripts on PATH, e.g. venv `Scripts/` or pipx)
liel-demo
Always works with the interpreter you used for pip (no reliance on PATH scripts):
pip install liel
# option 2: same demo as option 1, most portable
python -m liel.demo
The rest of examples/ (01, 07, notebooks, …) stays GitHub-only to keep the wheel light. From a checkout you can still run python examples/08_demo.py (thin wrapper). To download only the module source: curl -sS -O https://raw.githubusercontent.com/hy-token/liel/main/python/liel/demo.py.
Optional local LLM for the exploration list only: start Ollama, then LIEL_DEMO_LLM=1 liel-demo (override OLLAMA_HOST / OLLAMA_MODEL if needed). If Ollama is down, the command still finishes using a built-in fallback.
Quick limits: one writer per .liel file; no semantic/vector search in core; commit() defines crash-safe boundaries. Details: Reliability and failure model.
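What the commit boundary means in practice, as a minimal sketch using the same API as the examples below (the comments paraphrase the documented single-writer model; exact recovery behavior is in the Reliability reference):

import liel

with liel.open("agent-memory.liel") as db:
    db.add_node(["Fact"], text="User prefers dark mode")
    # Not yet durable: a crash before commit() loses the node above.
    db.commit()
    # Durable: commit() is the crash-safe boundary, so this state survives restarts.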
2. LLM memory in one file
Install the core package:
pip install liel
Instead of losing context between sessions, store decisions and relationships as a graph:
import liel

with liel.open("agent-memory.liel") as db:
    task = db.add_node(["Task"], name="Design AI memory system")
    decision = db.add_node(
        ["Decision"],
        content="Use graph memory instead of text-only retrieval",
    )
    source = db.add_node(["Source"], title="Architecture notes")
    db.add_edge(task, "LED_TO", decision)
    db.add_edge(decision, "SUPPORTED_BY", source)
    db.commit()

    for node in db.neighbors(task, edge_label="LED_TO"):
        print(node["content"])
Now your AI can recall why decisions were made, not just what was said.
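From here, graph-shaped context for the next prompt is one traversal away. A minimal sketch, continuing inside the same `with` block (`prompt_context` and the surrounding string assembly are illustrative, in the spirit of the demo's [Next Prompt Context] output):

    # Build a context block for the next prompt from the stored graph.
    context_lines = [
        f"Decision: {d['content']}"
        for d in db.neighbors(task, edge_label="LED_TO")
    ]
    prompt_context = "Known project decisions:\n" + "\n".join(context_lines)
    # Paste prompt_context into the agent prompt when it should steer the reply.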
3. Python property graph
For a minimal graph API example:
import liel

with liel.open(":memory:") as db:
    alice = db.add_node(["Person"], name="Alice")
    bob = db.add_node(["Person"], name="Bob")
    db.add_edge(alice, "KNOWS", bob, since=2020)
    db.commit()

    print(db.neighbors(alice, edge_label="KNOWS")[0]["name"])  # Bob
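Chaining neighbors() gives multi-hop traversal. A sketch continuing the session above (it assumes nodes returned by neighbors() can be passed back into neighbors(); if the returned objects are plain property dicts rather than handles, use the documented traversal surface instead):

    # Friends-of-friends: two KNOWS hops out from Alice.
    carol = db.add_node(["Person"], name="Carol")
    db.add_edge(bob, "KNOWS", carol, since=2023)
    db.commit()

    for friend in db.neighbors(alice, edge_label="KNOWS"):
        for fof in db.neighbors(friend, edge_label="KNOWS"):
            print(fof["name"])  # Carol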
For the Python API, transactions, QueryBuilder, traversal, and more examples, see the Python guide in the repository documentation.
4. Claude + MCP project memory
liel[mcp] exposes one official MCP surface for AI memory:
`liel_overview`, `liel_find`, `liel_explore`, `liel_trace`, `liel_map`, `liel_append`, `liel_merge`
Step 1 - Install
pip install "liel[mcp]"
Step 2 - Register the MCP server
Configuration file locations, approval flow, and setup UX differ between MCP
clients and change over time. Follow your client's official MCP setup guide,
then register liel with a server definition like this:
{
  "mcpServers": {
    "liel": {
      "command": "/path/to/python",
      "args": [
        "-m",
        "liel.mcp",
        "--path",
        "/path/to/your/project.liel"
      ]
    }
  }
}
Replace:
- `command` with the Python executable where `liel[mcp]` is installed
- `--path` with the `.liel` file you want the AI tool to use as durable memory
After registering the server, restart your MCP client and confirm that liel
appears as connected in its MCP management UI.
Step 3 - Tell Claude to use it
Add this block to your project's CLAUDE.md:
## Project Memory
- Use `liel[mcp]` as the long-term memory store for this project when the MCP server is available.
- At the start of a task, use `liel_overview`, then `liel_find`, then `liel_explore` to restore context before asking the user to repeat it.
- Save only durable information: confirmed decisions, stable preferences, important facts, open questions, and tasks that should survive the session.
- Do not save temporary reasoning, speculative ideas, or verbose logs.
- Reuse existing nodes and link new ones to related nodes. Avoid duplicates.
- Use `liel_append` when you want guaranteed new records, and `liel_merge` when you want to reuse existing nodes or idempotent edges.
- Write at meaningful checkpoints (task complete, decision confirmed, session end) - not on every turn.
That's it. Claude will now read and write your .liel memory file autonomously.
For the full setup guide, available tools, and a longer CLAUDE.md example, see the MCP guide in the repository documentation.
Install
Install the dependency-free core package:
pip install liel
Install the optional MCP integration only when you want an MCP-capable AI tool to use a .liel file as external memory:
pip install "liel[mcp]"
Platform support
- OS: Linux, macOS, Windows
- Architecture: x86_64 first, arm64 where practical
- Python: 3.9 or newer
This installs prebuilt wheels for supported platforms. Rust is not required at install time.
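To confirm the install worked for the interpreter you used, a one-file smoke test with the in-memory mode from the Quickstart:

import liel

# Open an in-memory store, write one node, and commit.
with liel.open(":memory:") as db:
    db.add_node(["Check"], status="ok")
    db.commit()
print("liel installed and working")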
If you are contributing or need a source build, see Contributing below and the repository documentation.
Architecture
The "agent memory layer" in the diagram is the role that liel provides (structured, relationship-centric memory). It is not a separate product sitting above liel; the two labels describe one stack.
The LLM is optional; the graph is the durable story.
flowchart TB
    agent["Layer 1: Local AI Agent\nplanner, reasoning, tool use"]
    mem["Layer 2: Agent memory layer (concept)\nentities + relations"]
    subgraph L [Layer 3: liel - three peer surfaces, one engine]
        direction LR
        rust["Rust core\ngraph storage, traversal, persistence"]
        py["Python (PyO3)\ninspection, scripting, integration"]
        primary["Primary interface\nliel-demo, MCP tools"]
    end
    store["Layer 4: single-file graph store: .liel"]
    agent -->|memory operations| mem
    mem --> L
    rust --> store
    py --> store
    primary --> store
If your Markdown viewer does not render the diagram, the same idea in plain text: agent → memory (entities + relations) → liel (Rust + Python + primary surfaces) → one .liel file.
What it is
liel is a single-file external-brain substrate for local memory and relationship-centric AI workflows.
It is built around a few deliberate choices:
- one portable `.liel` file instead of a server
- explicit graph relationships instead of text-only memory
- local persistence instead of cloud-managed infrastructure
- a small Rust core with a Python-first interface
Internally this is implemented as a property graph, but the product promise is higher-level: durable, inspectable memory that an LLM or local agent can carry between sessions.
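Concretely, "property graph" means nodes carry labels plus key-value properties, and edges carry a label, a direction, and optional properties. A minimal sketch with the Quickstart API (the int-valued node property mirrors the since=2020 edge property shown earlier; treat the accepted value types as an assumption, not a documented list):

import liel

with liel.open(":memory:") as db:
    # Nodes: label(s) plus arbitrary key-value properties.
    paper = db.add_node(["Source"], title="Graph memory notes", year=2024)
    claim = db.add_node(["Claim"], text="Graphs beat flat logs for recall")
    # Edges: a label, a direction, and optional properties.
    db.add_edge(claim, "SUPPORTED_BY", paper, confidence="high")
    db.commit()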
For the feature surface and file format, see the Reference in the repository documentation.
How this differs from RAG
Why this exists (what it is not for alone): storing graphs in a relational database is clunky; vector / RAG-style systems emphasize similarity more than hand-authored relationships; a plain chat log has no graph structure. liel is for the case where you want named entities, edges, and traversal in one local file.
RAG retrieves similar text chunks. liel stores and traverses relationships between entities, decisions, tasks, sources, files, and tool results.
Use RAG when your main problem is finding relevant passages. Use liel when your AI tool needs durable memory that can answer relationship-centric questions like:
- Which decision led to this task?
- What source supported that claim?
- Which files, tool calls, and follow-up tasks are connected?
liel is not a retrieval system. It is a persistent memory substrate for local AI workflows.
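Answering the second question is a single traversal once the edge exists. A minimal sketch, continuing inside the `with` block of the Quickstart memory example (where `decision` and its SUPPORTED_BY edge were created):

    # "What source supported that claim?" -- follow the SUPPORTED_BY edge.
    for src in db.neighbors(decision, edge_label="SUPPORTED_BY"):
        print(src["title"])  # Architecture notes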
When liel fits
Typical use case: durable agent or assistant memory — preferences, decisions, and relationships that should survive the session, with graph-shaped context you can hand to the next prompt or tool.
More generally, use liel when:
- you want local AI memory as a file, not a service
- relationships between entities matter
- you want decisions, facts, sources, tasks, and tool outputs to survive across sessions
- you want graph traversal without deploying a separate database server
- you want something easy to copy, back up, inspect, and archive
Common good fits:
- project memory for coding assistants
- local agent memory
- personal or team knowledge graphs
- MCP-backed memory for AI tools
- lightweight provenance-aware tool result stores
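For the last item, a sketch of what a provenance-aware tool result store can look like with the same primitives (the labels and edge names here are illustrative, not a prescribed schema):

import liel

with liel.open("tool-results.liel") as db:
    task = db.add_node(["Task"], name="Audit auth module")
    call = db.add_node(["ToolCall"], tool="grep", query="verify_token")
    result = db.add_node(["ToolResult"], summary="3 call sites found")
    db.add_edge(task, "RAN", call)
    db.add_edge(call, "PRODUCED", result)
    db.commit()
    # Later: which tool calls are connected to this task?
    for c in db.neighbors(task, edge_label="RAN"):
        print(c["tool"])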
For more examples and usage patterns, see the examples/ directory and notebooks in the repository.
When liel does not fit
liel is not the right tool when your primary need is:
- semantic similarity search over text
- full-text search or document retrieval as the main access pattern
- large-scale analytics or OLAP-style graph reporting as the main workload
- very large graph workloads with heavy concurrent writes
- server-style multi-user mutation
- SQL-centric graph querying over an existing relational system
In those cases, a vector database, search engine, PostgreSQL recursive queries, DuckDB graph extensions, or a server-backed graph database may fit better.
More detailed comparisons and non-goals are covered in Why liel and the design docs in the repository.
Design trade-offs
liel is intentionally narrow. The main trade-offs are:
- single-writer design rather than concurrent peer-to-peer mutation
- page-level WAL for durability, not ultra-high-frequency tiny commits
- no full-text engine, query language, or property index in the current product shape
- Python API and MCP integration first, with the Rust core kept small
That narrowness comes from the product framing: liel is trying to be a portable external brain for local AI systems, not a general-purpose graph database platform.
This is what keeps the system simple and portable, but it also defines where it is and is not comfortable to use.
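In practice, the WAL trade-off means batching related writes and committing at checkpoints rather than after every tiny mutation, which also matches the CLAUDE.md guidance above. A minimal sketch:

import liel

with liel.open("agent-memory.liel") as db:
    # Batch a checkpoint's worth of durable facts...
    for fact in ["prefers coffee", "works in Silicon Valley"]:
        db.add_node(["Fact"], text=fact)
    # ...then commit once, not per mutation.
    db.commit()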
Read the Reliability and failure model reference before using liel as durable application state.
liel is currently a Beta package. The supported contract is the
Python-first API documented in the guide, plus the single-writer,
single-file reliability model documented in the reference. 0.x releases may
still make breaking changes, but changes to the documented Beta surface should
be called out in the changelog with migration notes.
The project is open source with CI on the default branch; the local checks listed under Contributing mirror what runs in the repo. The demo falls back to built-in output when a local Ollama model is unavailable, so runs stay reproducible.
Feedback: try liel-demo, then open an issue if the memory model, MCP surface, or file format should evolve — especially what breaks in real agent workflows. 0.x is a good time to challenge assumptions; breaking API changes are fine if they are called out in the changelog.
Documentation
The PyPI source distribution is intentionally small and does not include the full documentation tree or example scripts. Use the GitHub repository for:
- Documentation index
- Why liel
- Python guide
- MCP guide
- Reference
- Design docs
- Examples
- Example notebooks
Contributing
Pull requests and issues are welcome. Good first step: run liel-demo and note anything confusing about output or the memory model. For bug reports, include OS, liel version, and a minimal way to reproduce.
Local checks:
cargo fmt
cargo clippy --all-targets --all-features -- -D warnings
cargo test
pytest tests/python/
License