ooai-persistence

ooai-persistence provides typed persistence helpers for LangGraph-based OOAI applications.

Responsibilities

  • checkpointer configuration and backend resolution
  • store configuration and backend resolution
  • graph cache configuration
  • strict serializer allowlist support
  • sync and async persistence contexts
  • local infrastructure defaults for Postgres, optional Redis, and optional MongoDB

Quick start

pdm install -G :all
pdm run pytest
pdm run ooai-persistence smoke --backend memory

Pragmatic usage

Most applications want one of these patterns:

  • open a managed persistence bundle and use the store directly
  • compile a StateGraph with managed persistence already attached
  • bind managed persistence onto a graph you already compiled elsewhere

For a no-infrastructure memory bundle:

from ooai_persistence import AppSettings, open_sync_persistence

settings = AppSettings.memory()

with open_sync_persistence(settings) as persistence:
    persistence.store.put(("users", "will"), "profile", {"name": "Will"})
    profile = persistence.store.get(("users", "will"), "profile")
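
If the managed store follows LangGraph's BaseStore interface, as the put/get calls above suggest (treat that as an assumption rather than a documented guarantee), listing and removing items looks like this:

with open_sync_persistence(settings) as persistence:
    items = persistence.store.search(("users", "will"))     # items stored under this namespace
    persistence.store.delete(("users", "will"), "profile")  # remove a single key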

For a pragmatic LangGraph flow:

from typing import TypedDict

from langgraph.graph import END, START, StateGraph
from ooai_persistence import AppSettings, open_graph


class State(TypedDict):
    question: str
    answer: str


def respond(state: State) -> State:
    return {"answer": f"Echo: {state['question']}"}


graph = StateGraph(State)
graph.add_node("respond", respond)
graph.add_edge(START, "respond")
graph.add_edge("respond", END)

settings = AppSettings.local_sqlite(".ooai/persistence/dev.sqlite3")

async with open_graph(graph, settings) as runtime:
    await runtime.persistence.store.aput(("profiles", "demo"), "name", {"value": "Will"})
    result = await runtime.graph.ainvoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )

When a graph uses a checkpointer, LangGraph expects a thread_id (or another checkpoint key) under configurable in the runnable config.
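
For example, a minimal runnable config for a checkpointed graph uses LangGraph's standard configurable keys (checkpoint_id is optional and only needed when resuming a previously saved checkpoint):

config = {
    "configurable": {
        "thread_id": "demo-thread",
        # "checkpoint_id": "<saved-checkpoint-id>",  # optional: resume an earlier checkpoint
    }
}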

If you already have a compiled graph, bind persistence onto it:

from ooai_persistence import AppSettings, bind_graph_with_persistence, open_sync_persistence

compiled = graph.compile()

with open_sync_persistence(AppSettings.memory()) as bundle:
    persistent_graph = bind_graph_with_persistence(compiled, bundle)
    result = persistent_graph.invoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )

LangGraph wrappers

The top-level graph helpers are:

  • compile_graph_with_persistence(graph, bundle, **compile_kwargs)
  • bind_graph_with_persistence(compiled_graph, bundle)
  • open_sync_graph(graph_or_compiled_graph, settings, **compile_kwargs)
  • open_graph(graph_or_compiled_graph, settings, **compile_kwargs)

open_sync_graph and open_graph yield a PersistentGraph with:

  • runtime.graph: the compiled or rebound LangGraph
  • runtime.persistence: the managed PersistenceBundle
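
A minimal sketch of the synchronous wrapper, reusing the StateGraph and settings factories from the examples above:

from ooai_persistence import AppSettings, open_sync_graph

with open_sync_graph(graph, AppSettings.memory()) as runtime:
    runtime.persistence.store.put(("profiles", "demo"), "name", {"value": "Will"})
    result = runtime.graph.invoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )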

CLI

The package ships a small diagnostics and smoke-test CLI:

ooai-persistence doctor --backend postgres --json
ooai-persistence smoke --backend memory
ooai-persistence smoke --backend sqlite --sqlite-path .ooai/persistence/smoke.sqlite3
ooai-persistence smoke --backend postgres --async
ooai-persistence env --output .env
ooai-persistence ensure-postgres

The async Postgres smoke command exercises both the LangGraph checkpointer and store through the public context API.

Local Postgres

make bootstrap
make infra-up
make infra-test-postgres
make infra-down

make bootstrap creates .env from .env.example when needed and installs all PDM extras. See infra/README.md for Compose details.

make infra-test-postgres runs the real async Postgres E2E path and the CLI smoke path against Docker Compose Postgres.

If Docker is not installed but .env points at a reachable Postgres server, make infra-up falls back to ooai-persistence ensure-postgres and creates the configured database if needed.

Default backend behavior

By default, checkpointer and store use backend="auto".

auto resolves in this order:

  1. async Postgres when configured
  2. async SQLite when configured
  3. async MongoDB when configured
  4. async Redis when configured
  5. in-memory fallback
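
As a sketch of the second rung, using only the factories shown earlier (whether auto actually lands on SQLite depends on what else your environment configures):

from ooai_persistence import AppSettings, open_persistence

# SQLite is configured, so "auto" should resolve to the async SQLite rung
# rather than falling through to memory.
settings = AppSettings.local_sqlite(".ooai/persistence/dev.sqlite3")

async with open_persistence(settings) as bundle:
    await bundle.store.aput(("demo",), "resolved", {"backend": "sqlite"})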

Serializer allowlist registry

The package includes a reusable strict-msgpack registry:

from ooai_persistence.serde.registry import MsgpackAllowlistRegistry

registry = MsgpackAllowlistRegistry()
registry.register_symbol("my_app.models", "WorkflowState")
registry.register_type(MyPersistedModel)
registry.register_import_string("my_app.models:AnotherPersistedModel")

That registry can be passed into the persistence context:

from ooai_persistence import AppSettings, open_persistence

settings = AppSettings()

async with open_persistence(settings, registry=registry) as bundle:
    ...

The same registry also flows through open_graph(...) and open_sync_graph(...).
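
For example, reusing the StateGraph and settings from the earlier examples (and assuming the graph helpers take the same registry keyword as open_persistence, which the statement above implies):

async with open_graph(graph, settings, registry=registry) as runtime:
    result = await runtime.graph.ainvoke(
        {"question": "hello", "answer": ""},
        config={"configurable": {"thread_id": "demo-thread"}},
    )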

Documentation and release checks

pdm run sphinx-build -W -b html docs docs/_build/html
pdm build

CI runs formatting, linting, typing, tests with coverage, and the Sphinx build. Docs also publish from main to GitHub Pages.

Releasing

Releases are tag-driven:

pdm lock --check
make check
git tag v0.2.2
git push origin v0.2.2

The release workflow verifies that the tag matches pyproject.toml, runs the async Postgres E2E checks, builds the wheel/sdist, publishes through PyPI Trusted Publishing, and creates a GitHub Release with artifacts attached.
