Memory infrastructure for AI agents — citations, audit trails, bi-temporal versioning, BYO-LLM

Aurra Python SDK

Memory infrastructure for AI agents. Stores conversation memories with source citations, full audit trails, and Bring-Your-Own-LLM extraction.

pip install aurra

Quick start

from aurra import Aurra

client = Aurra(api_key="aurra_...")

# Add memories from a dialog (Aurra extracts atomic facts via LLM)
result = client.memories.add(
    messages=[
        {"role": "user", "content": "Hi, I'm Alice. I play pickleball every Tuesday."},
        {"role": "assistant", "content": "Nice to meet you Alice!"},
    ],
    session_id="user_alice_session_1",
)
print(f"Saved {result.saved_count} memories: {result.memory_ids}")

# Read memories — every record includes a source_citation object
for m in client.memories.list(limit=5):
    print(m["topic"], "—", m["summary"])
    print("  source:", m["source_citation"]["type"], "/", m["source_citation"]["channel"])

# Ask a question — Aurra runs semantic search + LLM answer
result = client.memories.query("What does Alice do on Tuesdays?")
print(result.answer)

Bring Your Own LLM

By default, Aurra extracts memories using its hosted Anthropic key. Pass an llm config to use your own provider + key per request:

# Extract with your own OpenAI key
client.memories.add(
    messages=[...],
    session_id="...",
    llm={
        "provider": "openai",
        "api_key": "sk-proj-...",
        "model": "gpt-4o",
    },
)

# Or your own Anthropic key
client.memories.add(
    messages=[...],
    session_id="...",
    llm={
        "provider": "anthropic",
        "api_key": "sk-ant-...",
        "model": "claude-opus-4-5",
    },
)

Supported providers today: anthropic, openai. User-supplied keys are never logged or stored.

Source citations

Every memory comes back with a source_citation object so you can trace where it came from:

mem = client.memories.list(limit=1)[0]
print(mem["source_citation"])
# {
#   "type": "agent_session",
#   "channel": "user_alice_session_1",
#   "captured_at": "2026-04-30T22:31:09.123456+00:00",
#   "session_id": "user_alice_session_1",
#   "original_message": "User: Hi, I'm Alice...\nAssistant: Nice to meet you!"
# }

Audit trails

Get the full provenance + extraction metadata + history for any memory:

audit = client.memories.get_audit(memory_id="aa5edf37-0475-4e40-bcd5-17c2557f4d6a")
print(audit["provenance"]["original_input"])
print(audit["extraction"]["model"])           # "claude-opus-4-5"
print(audit["history"])                        # [{"event": "created", "at": "...", "by": "aurra-extractor"}]
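
The audit payload can get verbose, so a small helper that condenses it to one line is handy for logging. This sketch relies only on the keys documented above (extraction, history); the helper itself is ours, not part of the SDK:

```python
def audit_summary(audit: dict) -> str:
    """Condense a get_audit() response into a single log line."""
    model = audit["extraction"]["model"]
    events = " -> ".join(e["event"] for e in audit["history"])
    return f"extracted by {model}; history: {events}"

# print(audit_summary(client.memories.get_audit(memory_id=result.memory_ids[0])))
```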

Errors

Typed exceptions for predictable handling:

from aurra import (
    AurraAuthError,           # 401: bad Aurra key
    AurraValidationError,     # 400: bad request shape
    AurraNotFoundError,       # 404: memory not found
    AurraRateLimitError,      # 429: Aurra rate limit
    AurraLLMProviderError,    # BYO-LLM: provider rejected (bad key, model not found, etc.)
    AurraServerError,         # 5xx: Aurra backend error
    AurraConnectionError,     # network: couldn't reach Aurra
    AurraError,               # base class for all of the above
)

try:
    client.memories.add(
        messages=[{"role": "user", "content": "test"}],
        session_id="s",
        llm={"provider": "openai", "api_key": "sk-bad", "model": "gpt-4o"},
    )
except AurraLLMProviderError as e:
    print(f"{e.provider}: {e.message}")  # "openai: Invalid OpenAI API key"
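
The 429 and network errors are transient, which makes them natural candidates for retries with exponential backoff. A generic sketch — the helper is our own, not part of the SDK; pass it whichever Aurra exceptions you consider retryable:

```python
import time

def with_retries(call, retryable, attempts=3, base_delay=1.0):
    """Invoke `call`, retrying on `retryable` exceptions with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the original error
            time.sleep(base_delay * (2 ** attempt))

# from aurra import AurraRateLimitError, AurraConnectionError
# result = with_retries(
#     lambda: client.memories.add(messages=msgs, session_id="s"),
#     retryable=(AurraRateLimitError, AurraConnectionError),
# )
```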

Use with Claude

import anthropic
from aurra import Aurra

memory = Aurra(api_key="aurra_...")
claude = anthropic.Anthropic()

def ask_with_memory(question: str) -> str:
    context = memory.memories.query(question)
    response = claude.messages.create(
        model="claude-opus-4-5",
        max_tokens=1024,
        messages=[{
            "role": "user",
            "content": f"Context:\n{context.answer}\n\nQuestion: {question}",
        }],
    )
    return response.content[0].text
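
A natural extension is to write each exchange back so later questions can draw on it. The helper below is our own sketch; it uses only memories.add as shown in the quick start:

```python
def remember_exchange(memory, question: str, answer: str, session_id: str):
    """Persist one Q/A turn back into Aurra so later queries can cite it."""
    return memory.memories.add(
        messages=[
            {"role": "user", "content": question},
            {"role": "assistant", "content": answer},
        ],
        session_id=session_id,
    )

# answer = ask_with_memory("What does Alice do on Tuesdays?")
# remember_exchange(memory, "What does Alice do on Tuesdays?", answer, "user_alice_session_1")
```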

Use with OpenAI

from openai import OpenAI
from aurra import Aurra

memory = Aurra(api_key="aurra_...")
openai_client = OpenAI()

def ask_with_memory(question: str) -> str:
    context = memory.memories.query(question)
    response = openai_client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": f"Context: {context.answer}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

API reference

Aurra(api_key, base_url=None, timeout=60.0)
    Create a client.
client.memories.add(messages, session_id, tenant_id=None, llm=None)
    Extract memories from a dialog. Returns AddResult.
client.memories.add(content, topic, importance, source)
    Write a single memory verbatim (legacy shape).
client.memories.list(limit=20)
    List recent memories with citations.
client.memories.query(question, limit=10)
    Ask a question and get an LLM-generated answer. Returns QueryResult.
client.memories.get_audit(memory_id)
    Full provenance, extraction metadata, and history for one memory.

License

MIT

