
Typed persistence utilities for LangGraph-based OOAI applications.


ooai-persistence


ooai-persistence gives LangGraph apps a usable persistence layer without making you hand-build a big settings tree first.

Responsibilities

  • checkpointer configuration and backend resolution
  • store configuration and backend resolution
  • graph cache configuration
  • strict serializer allowlist support
  • sync and async persistence contexts
  • local infrastructure defaults for Postgres, optional Redis, and optional MongoDB

Quick start

pdm install -G :all
pdm run pytest
pdm run ooai-persistence smoke --backend memory

Start here

Most applications should start with one of these helpers:

  • memory_settings() for tests and local no-infra runs
  • sqlite_settings(path) for one-file local persistence
  • postgres_settings(...) for the real async Postgres path

from ooai_persistence import memory_settings, postgres_settings, sqlite_settings

memory = memory_settings()
sqlite = sqlite_settings(".ooai/persistence/dev.sqlite3")
postgres = postgres_settings(database="ooai_persistence")
postgres_via_uri = postgres_settings("postgresql://postgres:postgres@localhost:5442/ooai_persistence?sslmode=disable")

If those cover your case, you do not need to construct AppSettings(...) directly.
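The URI form shown above can be decomposed with the standard library, which makes it easy to see how its pieces line up with the keyword arguments that postgres_settings(...) takes. This is a minimal illustration of the DSN anatomy only; the helper's own parsing may differ.

```python
from urllib.parse import parse_qs, urlsplit

# Decompose the example DSN into the same pieces the keyword form takes.
uri = "postgresql://postgres:postgres@localhost:5442/ooai_persistence?sslmode=disable"
parts = urlsplit(uri)

host = parts.hostname              # "localhost"
port = parts.port                  # 5442
user = parts.username              # "postgres"
password = parts.password          # "postgres"
database = parts.path.lstrip("/")  # "ooai_persistence"
options = parse_qs(parts.query)    # {"sslmode": ["disable"]}
```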

If you want the shortest possible path, skip settings entirely and open the bundle directly:

from ooai_persistence import (
    open_memory_persistence,
    open_postgres_persistence,
    open_sqlite_persistence,
)

async with open_postgres_persistence(database="ooai_persistence") as persistence:
    await persistence.store.aput(("profiles", "demo"), "name", {"value": "Will"})

That is the easiest async entrypoint in the package right now.

If you only want the long-term store and not the full persistence bundle:

from ooai_persistence import open_postgres_store

async with open_postgres_store(database="ooai_persistence") as store:
    await store.aput(("profiles", "demo"), "name", {"value": "Will"})
    item = await store.aget(("profiles", "demo"), "name")

Common patterns

1. Use the store directly

from ooai_persistence import open_sync_memory_store

with open_sync_memory_store() as store:
    store.put(("users", "will"), "profile", {"name": "Will"})
    profile = store.get(("users", "will"), "profile")

2. Compile a graph with async persistence attached

from typing import TypedDict

from langgraph.graph import END, START, StateGraph
from ooai_persistence import open_graph, postgres_settings


class State(TypedDict):
    question: str
    answer: str


def respond(state: State) -> State:
    return {"answer": f"Echo: {state['question']}"}


graph = StateGraph(State)
graph.add_node("respond", respond)
graph.add_edge(START, "respond")
graph.add_edge("respond", END)

settings = postgres_settings(
    host="localhost",
    port=5442,
    database="ooai_persistence",
    user="postgres",
    password="postgres",
)

async with open_graph(graph, settings) as runtime:
    await runtime.persistence.store.aput(("profiles", "demo"), "name", {"value": "Will"})
    result = await runtime.graph.ainvoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )

When a graph uses a checkpointer, LangGraph expects a configurable.thread_id or another checkpoint key in the runnable config.
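That config shape is easy to centralize in a tiny helper rather than repeating the nested dict at every call site. The thread_config function below is not part of this package; it is just a sketch of the runnable-config shape used in the examples above.

```python
def thread_config(thread_id: str) -> dict:
    # Minimal runnable config carrying the checkpoint key LangGraph looks for.
    return {"configurable": {"thread_id": thread_id}}

config = thread_config("demo-thread")
```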

3. Bind persistence onto a compiled graph

from ooai_persistence import bind_graph_with_persistence, memory_settings, open_sync_persistence

compiled = graph.compile()

with open_sync_persistence(memory_settings()) as bundle:
    persistent_graph = bind_graph_with_persistence(compiled, bundle)
    result = persistent_graph.invoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )

Async Postgres, the easy way

If your real target is async Postgres, the shortest path is:

from ooai_persistence import open_postgres_persistence

async with open_postgres_persistence(database="ooai_persistence") as persistence:
    await persistence.store.aput(("profiles", "demo"), "name", {"value": "Will"})

If you are compiling a LangGraph too, build the settings the same way and pass them to open_graph(...):

from ooai_persistence import open_graph, postgres_settings

settings = postgres_settings(database="ooai_persistence")

Or use a URI:

from ooai_persistence import open_postgres_persistence

async with open_postgres_persistence(
    "postgresql://postgres:postgres@localhost:5442/ooai_persistence?sslmode=disable"
) as persistence:
    ...

If you call the Postgres helpers without connection arguments, they read the same OOAI_PERSISTENCE_INFRA__POSTGRES_* settings that AppSettings() uses.
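The environment variables follow the OOAI_PERSISTENCE_INFRA__POSTGRES_* prefix documented above with the usual double-underscore nesting convention. The concrete suffixes below are assumptions shown for illustration; check AppSettings for the exact field names.

```shell
# Assumed field names for illustration; the prefix is documented above,
# but verify the exact suffixes against AppSettings.
export OOAI_PERSISTENCE_INFRA__POSTGRES_HOST=localhost
export OOAI_PERSISTENCE_INFRA__POSTGRES_PORT=5442
export OOAI_PERSISTENCE_INFRA__POSTGRES_DATABASE=ooai_persistence
export OOAI_PERSISTENCE_INFRA__POSTGRES_USER=postgres
export OOAI_PERSISTENCE_INFRA__POSTGRES_PASSWORD=postgres
```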

Then bring Postgres up locally:

make bootstrap
make infra-up
make infra-test-postgres

That path exercises the real async LangGraph checkpointer and store, not a fake shim around them.

LangGraph wrappers

The top-level graph helpers are:

  • compile_graph_with_persistence(graph, bundle, **compile_kwargs)
  • bind_graph_with_persistence(compiled_graph, bundle)
  • open_sync_graph(graph_or_compiled_graph, settings, **compile_kwargs)
  • open_graph(graph_or_compiled_graph, settings, **compile_kwargs)

The top-level persistence helpers are:

  • open_sync_memory_persistence()
  • open_sync_sqlite_persistence(path)
  • open_sync_postgres_persistence(...)
  • open_memory_persistence()
  • open_sqlite_persistence(path)
  • open_postgres_persistence(...)

The top-level store-only helpers are:

  • open_sync_store(settings)
  • open_sync_memory_store()
  • open_sync_sqlite_store(path)
  • open_sync_postgres_store(...)
  • open_store(settings)
  • open_memory_store()
  • open_sqlite_store(path)
  • open_postgres_store(...)

open_sync_graph and open_graph yield a PersistentGraph with:

  • runtime.graph: the compiled or rebound LangGraph
  • runtime.persistence: the managed PersistenceBundle

CLI

The package ships a small diagnostics and smoke-test CLI:

ooai-persistence doctor --backend postgres --json
ooai-persistence smoke --backend memory
ooai-persistence smoke --backend sqlite --sqlite-path .ooai/persistence/smoke.sqlite3
ooai-persistence smoke --backend postgres --async
ooai-persistence env --output .env
ooai-persistence ensure-postgres

The async Postgres smoke command exercises both the LangGraph checkpointer and store through the public context API.

Local Postgres

make bootstrap
make infra-up
make infra-up-docker
make infra-test-postgres
make infra-down

make bootstrap creates .env from .env.example when needed and installs all PDM extras. See infra/README.md for Compose details.

make infra-test-postgres remains as a compatibility alias for the same Postgres E2E and CLI smoke flow.

If you want the shortest test commands, use:

make test-e2e-memory
make test-e2e-sqlite
make test-e2e-local
make test-e2e-postgres

The Postgres target brings up the configured database, runs the public wrapper E2E suite for open_postgres_persistence(...) and open_postgres_store(...), and then runs the async CLI smoke check.

make infra-up is the ergonomic default: it uses the Postgres server from .env, ensures the configured database exists, and avoids depending on Docker just to run the async store and persistence tests locally.

If you want the Compose-backed service explicitly, use make infra-up-docker.

The matching .env path is already laid out in .env.example.

Default backend behavior

By default, checkpointer and store use backend="auto".

auto resolves in this order:

  1. async Postgres when configured
  2. async SQLite when configured
  3. async MongoDB when configured
  4. async Redis when configured
  5. in-memory fallback
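The precedence can be pictured as a small pure function. This is an illustrative model of the documented order only, not the package's internal resolution code.

```python
def resolve_backend(configured: set[str]) -> str:
    """Model of the documented backend="auto" precedence.

    `configured` holds the backend names that have usable configuration.
    Illustrative only; the package's internals may differ in detail.
    """
    for candidate in ("postgres", "sqlite", "mongodb", "redis"):
        if candidate in configured:
            return candidate
    return "memory"  # in-memory fallback when nothing is configured
```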

Serializer allowlist registry

The package includes a reusable strict-msgpack registry:

from ooai_persistence.serde.registry import MsgpackAllowlistRegistry

registry = MsgpackAllowlistRegistry()
registry.register_symbol("my_app.models", "WorkflowState")
registry.register_type(MyPersistedModel)
registry.register_import_string("my_app.models:AnotherPersistedModel")
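The semantics of those three register_* calls can be pictured as maintaining a set of allowed "module:qualname" entries, with anything outside the set refused at (de)serialization time. The class below is a toy stand-in for illustration, not the real MsgpackAllowlistRegistry.

```python
class AllowlistModel:
    """Toy model of a strict serializer allowlist (illustrative only)."""

    def __init__(self) -> None:
        self._allowed: set[str] = set()

    def register_symbol(self, module: str, name: str) -> None:
        # register_symbol("my_app.models", "WorkflowState") -> "my_app.models:WorkflowState"
        self._allowed.add(f"{module}:{name}")

    def register_type(self, tp: type) -> None:
        # register_type(MyModel) derives the entry from the class itself.
        self._allowed.add(f"{tp.__module__}:{tp.__qualname__}")

    def register_import_string(self, import_string: str) -> None:
        # register_import_string("my_app.models:AnotherPersistedModel") is taken as-is.
        self._allowed.add(import_string)

    def is_allowed(self, import_string: str) -> bool:
        return import_string in self._allowed
```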

That registry can be passed into the persistence context:

from ooai_persistence import AppSettings, open_persistence

settings = AppSettings()

async with open_persistence(settings, registry=registry) as bundle:
    ...

The same registry also flows through open_graph(...), open_sync_graph(...), and the direct-open helpers such as open_postgres_persistence(...).

When to use AppSettings directly

Reach for AppSettings(...) only when you want to:

  • override checkpointer and store backends separately
  • drive config from .env
  • customize serializer allowlists, cache settings, or infra defaults
  • compose persistence settings into a larger application settings object

Documentation and release checks

pdm run sphinx-build -W -b html docs docs/_build/html
pdm build

CI runs formatting, linting, typing, tests with coverage, and the Sphinx build. Docs also publish from main to GitHub Pages.

Releasing

Releases are tag-driven:

pdm lock --check
make check
git tag vX.Y.Z
git push origin vX.Y.Z

The release workflow verifies that the tag matches pyproject.toml, runs the async Postgres E2E checks, builds the wheel/sdist, publishes through PyPI Trusted Publishing, and creates a GitHub Release with artifacts attached.
