Typed persistence utilities for LangGraph-based OOAI applications.
ooai-persistence
ooai-persistence gives LangGraph apps a usable persistence layer without
making you hand-build a big settings tree first.
Responsibilities
- checkpointer configuration and backend resolution
- store configuration and backend resolution
- graph cache configuration
- strict serializer allowlist support
- sync and async persistence contexts
- local infrastructure defaults for Postgres, optional Redis, and optional MongoDB
Quick start
pdm install -G :all
pdm run pytest
pdm run ooai-persistence smoke --backend memory
Start here
Most applications should start with one of these helpers:
- memory_settings() for tests and local no-infra runs
- sqlite_settings(path) for one-file local persistence
- postgres_settings(...) for the real async Postgres path
from ooai_persistence import memory_settings, postgres_settings, sqlite_settings
memory = memory_settings()
sqlite = sqlite_settings(".ooai/persistence/dev.sqlite3")
postgres = postgres_settings(database="ooai_persistence")
postgres_via_uri = postgres_settings("postgresql://postgres:postgres@localhost:5442/ooai_persistence?sslmode=disable")
If those cover your case, you do not need to construct AppSettings(...)
directly.
If you want the shortest possible path, skip settings entirely and open the bundle directly:
from ooai_persistence import (
    open_memory_persistence,
    open_postgres_persistence,
    open_sqlite_persistence,
)

async with open_postgres_persistence(database="ooai_persistence") as persistence:
    await persistence.store.aput(("profiles", "demo"), "name", {"value": "Will"})
That is the easiest async entrypoint in the package right now.
If you only want the long-term store and not the full persistence bundle:
from ooai_persistence import open_postgres_store
async with open_postgres_store(database="ooai_persistence") as store:
    await store.aput(("profiles", "demo"), "name", {"value": "Will"})
    item = await store.aget(("profiles", "demo"), "name")
Common patterns
1. Use the store directly
from ooai_persistence import open_sync_memory_store
with open_sync_memory_store() as store:
    store.put(("users", "will"), "profile", {"name": "Will"})
    profile = store.get(("users", "will"), "profile")
2. Compile a graph with async persistence attached
from typing import TypedDict
from langgraph.graph import END, START, StateGraph
from ooai_persistence import open_graph, postgres_settings
class State(TypedDict):
    question: str
    answer: str

def respond(state: State) -> State:
    return {"answer": f"Echo: {state['question']}"}
graph = StateGraph(State)
graph.add_node("respond", respond)
graph.add_edge(START, "respond")
graph.add_edge("respond", END)
settings = postgres_settings(
    host="localhost",
    port=5442,
    database="ooai_persistence",
    user="postgres",
    password="postgres",
)
async with open_graph(graph, settings) as runtime:
    await runtime.persistence.store.aput(("profiles", "demo"), "name", {"value": "Will"})
    result = await runtime.graph.ainvoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )
When a graph uses a checkpointer, LangGraph expects a configurable.thread_id
or another checkpoint key in the runnable config.
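To make that shape concrete, here is a tiny local helper that builds the runnable config a checkpointed graph requires. `thread_config` is illustrative only, not part of ooai_persistence; the `{"configurable": {"thread_id": ...}}` shape itself is what LangGraph inspects.

```python
def thread_config(thread_id: str) -> dict:
    """Build the runnable config a checkpointed LangGraph expects.

    Local convenience helper for illustration; not a package API.
    The checkpointer uses thread_id to scope saved state to one run.
    """
    return {"configurable": {"thread_id": thread_id}}

config = thread_config("demo-thread")
assert config == {"configurable": {"thread_id": "demo-thread"}}
```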
3. Bind persistence onto a compiled graph
from ooai_persistence import bind_graph_with_persistence, memory_settings, open_sync_persistence
compiled = graph.compile()
with open_sync_persistence(memory_settings()) as bundle:
    persistent_graph = bind_graph_with_persistence(compiled, bundle)
    result = persistent_graph.invoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )
Async Postgres, the easy way
If your real target is async Postgres, the shortest path is:
from ooai_persistence import open_postgres_persistence
async with open_postgres_persistence(database="ooai_persistence") as persistence:
    await persistence.store.aput(("profiles", "demo"), "name", {"value": "Will"})
If you are compiling a LangGraph too:
from ooai_persistence import open_graph, postgres_settings
settings = postgres_settings(database="ooai_persistence")
Or use a URI:
from ooai_persistence import open_postgres_persistence
async with open_postgres_persistence(
    "postgresql://postgres:postgres@localhost:5442/ooai_persistence?sslmode=disable"
) as persistence:
    ...
If you call the Postgres helpers without connection arguments, they read the
same OOAI_PERSISTENCE_INFRA__POSTGRES_* settings that AppSettings() uses.
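For example, a local .env could carry values like the following. The field names after the documented prefix are illustrative guesses matching the postgres_settings(...) parameters shown above; check AppSettings for the authoritative list.

```shell
# Hypothetical field names under the documented OOAI_PERSISTENCE_INFRA__POSTGRES_*
# prefix; verify the exact names against AppSettings before relying on them.
OOAI_PERSISTENCE_INFRA__POSTGRES_HOST=localhost
OOAI_PERSISTENCE_INFRA__POSTGRES_PORT=5442
OOAI_PERSISTENCE_INFRA__POSTGRES_DATABASE=ooai_persistence
OOAI_PERSISTENCE_INFRA__POSTGRES_USER=postgres
OOAI_PERSISTENCE_INFRA__POSTGRES_PASSWORD=postgres
```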
Then bring Postgres up locally:
make bootstrap
make infra-up
make infra-test-postgres
That path exercises the real async LangGraph checkpointer and store, not a fake shim around them.
LangGraph wrappers
The top-level graph helpers are:
- compile_graph_with_persistence(graph, bundle, **compile_kwargs)
- bind_graph_with_persistence(compiled_graph, bundle)
- open_sync_graph(graph_or_compiled_graph, settings, **compile_kwargs)
- open_graph(graph_or_compiled_graph, settings, **compile_kwargs)
The top-level persistence helpers are:
- open_sync_memory_persistence()
- open_sync_sqlite_persistence(path)
- open_sync_postgres_persistence(...)
- open_memory_persistence()
- open_sqlite_persistence(path)
- open_postgres_persistence(...)
The top-level store-only helpers are:
- open_sync_store(settings)
- open_sync_memory_store()
- open_sync_sqlite_store(path)
- open_sync_postgres_store(...)
- open_store(settings)
- open_memory_store()
- open_sqlite_store(path)
- open_postgres_store(...)
open_sync_graph and open_graph yield a PersistentGraph with:
- runtime.graph: the compiled or rebound LangGraph
- runtime.persistence: the managed PersistenceBundle
CLI
The package ships a small diagnostics and smoke-test CLI:
ooai-persistence doctor --backend postgres --json
ooai-persistence smoke --backend memory
ooai-persistence smoke --backend sqlite --sqlite-path .ooai/persistence/smoke.sqlite3
ooai-persistence smoke --backend postgres --async
ooai-persistence env --output .env
ooai-persistence ensure-postgres
The async Postgres smoke command exercises both the LangGraph checkpointer and store through the public context API.
Local Postgres
make bootstrap
make infra-up
make infra-up-docker
make infra-test-postgres
make infra-down
make bootstrap creates .env from .env.example when needed and installs all
PDM extras. See infra/README.md for Compose details.
make infra-test-postgres remains as a compatibility alias for the same
Postgres E2E and CLI smoke flow.
If you want the shortest test commands, use:
make test-e2e-memory
make test-e2e-sqlite
make test-e2e-local
make test-e2e-postgres
The Postgres target brings up the configured database, runs the public wrapper
E2E suite for open_postgres_persistence(...) and open_postgres_store(...),
and then runs the async CLI smoke check.
make infra-up is the ergonomic default: it uses the Postgres server from
.env, ensures the configured database exists, and avoids depending on Docker
just to run the async store and persistence tests locally.
If you want the Compose-backed service explicitly, use make infra-up-docker.
The matching .env path is already laid out in .env.example.
Default backend behavior
By default, checkpointer and store use backend="auto".
auto resolves in this order:
- async Postgres when configured
- async SQLite when configured
- async MongoDB when configured
- async Redis when configured
- in-memory fallback
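The resolution order above amounts to a first-match scan over the configured backends. The sketch below is illustrative only: the real resolver lives inside the package and works on AppSettings, not a plain set of names.

```python
# Illustrative stand-in for backend="auto" resolution; not the package's code.
PRIORITY = ["postgres", "sqlite", "mongodb", "redis"]

def resolve_backend(configured: set[str]) -> str:
    """Return the first configured backend in priority order,
    falling back to in-memory when nothing is configured."""
    for backend in PRIORITY:
        if backend in configured:
            return backend
    return "memory"

assert resolve_backend({"redis", "sqlite"}) == "sqlite"
assert resolve_backend(set()) == "memory"
```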
Serializer allowlist registry
The package includes a reusable strict-msgpack registry:
from ooai_persistence.serde.registry import MsgpackAllowlistRegistry
registry = MsgpackAllowlistRegistry()
registry.register_symbol("my_app.models", "WorkflowState")
registry.register_type(MyPersistedModel)
registry.register_import_string("my_app.models:AnotherPersistedModel")
That registry can be passed into the persistence context:
from ooai_persistence import AppSettings, open_persistence
settings = AppSettings()
async with open_persistence(settings, registry=registry) as bundle:
    ...
The same registry also flows through open_graph(...) and open_sync_graph(...).
It also flows through the direct-open helpers like open_postgres_persistence(...).
When to use AppSettings directly
Reach for AppSettings(...) only when you want to:
- override checkpointer and store backends separately
- drive config from .env
- customize serializer allowlists, cache settings, or infra defaults
- compose persistence settings into a larger application settings object
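As a sketch of the composition case, a larger application settings object can embed the persistence settings as one nested field. Everything here is a stand-in for illustration: `PersistenceSettings`, `MyAppSettings`, and their field names are hypothetical, not the real AppSettings schema.

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for ooai_persistence.AppSettings; fields are illustrative.
@dataclass
class PersistenceSettings:
    checkpointer_backend: str = "auto"
    store_backend: str = "auto"

# A larger application settings object that embeds persistence settings
# as one nested field, rather than flattening everything into one class.
@dataclass
class MyAppSettings:
    app_name: str = "my-app"
    persistence: PersistenceSettings = field(default_factory=PersistenceSettings)

settings = MyAppSettings()
assert settings.persistence.checkpointer_backend == "auto"
```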
Documentation and release checks
pdm run sphinx-build -W -b html docs docs/_build/html
pdm build
CI runs formatting, linting, typing, tests with coverage, and the Sphinx build.
Docs also publish from main to GitHub Pages.
Releasing
Releases are tag-driven:
pdm lock --check
make check
git tag vX.Y.Z
git push origin vX.Y.Z
The release workflow verifies that the tag matches pyproject.toml, runs the
async Postgres E2E checks, builds the wheel/sdist, publishes through PyPI
Trusted Publishing, and creates a GitHub Release with artifacts attached.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file ooai_persistence-0.2.5.tar.gz.
File metadata
- Download URL: ooai_persistence-0.2.5.tar.gz
- Upload date:
- Size: 27.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 97775071ee47f324b0a2fc2051c4c16c08f3711d4c27c91bbc7a43d3c926eac4 |
| MD5 | c9250d4b2fb37b093c0b8c153afaf83d |
| BLAKE2b-256 | 76279d6b96e3408f697592add7688d29ba1dc25555ebb784f4072ecee37b3cd8 |
Provenance
The following attestation bundles were made for ooai_persistence-0.2.5.tar.gz:
Publisher: release.yml on pr1m8/ooai-persistence

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ooai_persistence-0.2.5.tar.gz
- Subject digest: 97775071ee47f324b0a2fc2051c4c16c08f3711d4c27c91bbc7a43d3c926eac4
- Sigstore transparency entry: 1439935163
- Sigstore integration time:
- Permalink: pr1m8/ooai-persistence@509bbdce4855388248113c932445cb5955945083
- Branch / Tag: refs/tags/v0.2.5
- Owner: https://github.com/pr1m8
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@509bbdce4855388248113c932445cb5955945083
- Trigger Event: push
File details
Details for the file ooai_persistence-0.2.5-py3-none-any.whl.
File metadata
- Download URL: ooai_persistence-0.2.5-py3-none-any.whl
- Upload date:
- Size: 23.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6c74ed97e94ba230477c8d240df9364fee48c48af9bab3df29689503f1528c32 |
| MD5 | 8ee8b8f126f8b8298b3c793cf2e2fe52 |
| BLAKE2b-256 | c054c271d12ac365bb83e8a97e8a7d892ec77d6f79dcb35277817cad090566fa |
Provenance
The following attestation bundles were made for ooai_persistence-0.2.5-py3-none-any.whl:
Publisher: release.yml on pr1m8/ooai-persistence

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ooai_persistence-0.2.5-py3-none-any.whl
- Subject digest: 6c74ed97e94ba230477c8d240df9364fee48c48af9bab3df29689503f1528c32
- Sigstore transparency entry: 1439935173
- Sigstore integration time:
- Permalink: pr1m8/ooai-persistence@509bbdce4855388248113c932445cb5955945083
- Branch / Tag: refs/tags/v0.2.5
- Owner: https://github.com/pr1m8
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@509bbdce4855388248113c932445cb5955945083
- Trigger Event: push