
z3rno-core


Core memory engine for Z3rno -- PostgreSQL schema, SQLAlchemy models, Alembic migrations, and the store/recall/forget/audit library.

Features

  • 7 relational tables -- tenants, API keys, agents, memories, memory relationships, lifecycle policies, audit log
  • 15 Alembic migrations -- fully versioned schema evolution
  • Row-Level Security (RLS) -- multi-tenant isolation at the database level
  • SCD Type 2 versioning -- temporal history for every memory mutation
  • pgvector HNSW indexing -- fast approximate nearest-neighbor vector search
  • Apache AGE graph layer -- entity and relationship traversal via Cypher queries
  • Hash-chained audit log -- tamper-evident record of every store, recall, forget, and update
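The tamper evidence in the audit log comes from chaining each entry's hash to its predecessor's, so rewriting any historical entry invalidates every hash after it. A minimal sketch of the idea in plain Python (the field names and genesis value here are illustrative, not z3rno-core's actual audit columns):

```python
import hashlib
import json

def chain_hash(prev_hash: str, entry: dict) -> str:
    """Hash an audit entry together with the previous entry's hash."""
    payload = json.dumps(entry, sort_keys=True).encode()
    return hashlib.sha256(prev_hash.encode() + payload).hexdigest()

entries = [
    {"action": "store", "memory_id": "m1"},
    {"action": "recall", "memory_id": "m1"},
    {"action": "forget", "memory_id": "m1"},
]

# Build the chain from a fixed genesis value.
hashes = []
prev = "0" * 64
for e in entries:
    prev = chain_hash(prev, e)
    hashes.append(prev)

# Verification: recompute from genesis and compare stored hashes.
prev = "0" * 64
for e, h in zip(entries, hashes):
    prev = chain_hash(prev, e)
    assert prev == h  # any edited entry breaks here and on every later link
```

Because each hash covers the previous one, an auditor only needs the genesis value and the stored entries to detect any retroactive modification.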

Quickstart

Run migrations

# Set your database URL
export DATABASE_URL="postgresql://z3rno:password@localhost:5432/z3rno"

# Install
pip install -e .

# Apply all migrations
alembic upgrade head

Use the engine

from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker
from z3rno_core.engine import store, recall, forget, audit

engine = create_async_engine("postgresql+asyncpg://z3rno:password@localhost/z3rno")
Session = async_sessionmaker(engine)
org_id = "your-org-id"  # the tenant/org this session operates under

async with Session() as db:  # run inside an async function (e.g. via asyncio.run)
    # Store a memory
    memory = await store(db, org_id=org_id, agent_id="agent-1", content="User prefers dark mode")

    # Recall by semantic similarity
    results = await recall(db, org_id=org_id, agent_id="agent-1", query="user preferences", top_k=5)

    # Soft-delete a memory
    await forget(db, org_id=org_id, memory_id=memory.id)

    # Query the audit trail
    entries = await audit(db, org_id=org_id, agent_id="agent-1")

For a detailed step-by-step setup, see QUICKSTART.md.

Full documentation: astron-bb4261fd.mintlify.app

Architecture

The schema follows a strict dependency order:

tenants -> api_keys -> agents -> memories -> memory_relationships
                                    |
                                    +-> lifecycle_policies
                                    +-> audit_log
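Under SCD Type 2, a memory update closes the current row (by stamping its end timestamp) and inserts a new row, rather than overwriting in place. A minimal in-memory sketch of that pattern (the dataclass fields are illustrative, not the real `memories` schema):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class MemoryVersion:
    memory_id: str
    content: str
    valid_from: datetime
    valid_to: Optional[datetime] = None  # None means this is the current version

def scd2_update(history: list, memory_id: str, new_content: str) -> None:
    """Close the current version row and append a new one (SCD Type 2)."""
    now = datetime.now(timezone.utc)
    for row in history:
        if row.memory_id == memory_id and row.valid_to is None:
            row.valid_to = now  # close out the old version, keeping its history
    history.append(MemoryVersion(memory_id, new_content, valid_from=now))
```

The payoff is that the full history of every memory remains queryable: filtering on `valid_to IS NULL` gives the current state, while a timestamp range reconstructs any past state.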

All engine functions operate within a single async SQLAlchemy session and respect RLS via SET LOCAL on the org context.
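For context, RLS scoped by a `SET LOCAL` org context typically looks like the following in Postgres; the policy, table, and setting names here are illustrative, not necessarily the ones z3rno-core's migrations create:

```sql
-- Illustrative only: enable RLS on memories and restrict rows to the
-- org set via SET LOCAL at the start of each transaction.
ALTER TABLE memories ENABLE ROW LEVEL SECURITY;

CREATE POLICY org_isolation ON memories
    USING (org_id = current_setting('app.current_org')::uuid);

-- At the start of each transaction, the engine would run something like:
-- SET LOCAL app.current_org = '<org uuid>';
```

Because `SET LOCAL` scopes the setting to the transaction, the org context cannot leak across pooled connections once the transaction commits or rolls back.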


Development

uv sync --dev
uv run ruff check .
uv run mypy .
uv run pytest

See CONTRIBUTING.md for the full workflow.

License

Apache 2.0 -- see LICENSE.
