
Aurra Python SDK

Memory infrastructure for AI agents. Stores conversation memories with source citations, full audit trails, and Bring-Your-Own-LLM extraction.

pip install aurra

Quick start

from aurra import Aurra

client = Aurra(api_key="aurra_...")

# Add memories from a dialog (Aurra extracts atomic facts via LLM)
result = client.memories.add(
    messages=[
        {"role": "user", "content": "Hi, I'm Alice. I play pickleball every Tuesday."},
        {"role": "assistant", "content": "Nice to meet you Alice!"},
    ],
    session_id="user_alice_session_1",
)
print(f"Saved {result.saved_count} memories: {result.memory_ids}")

# Read memories — every record includes a source_citation object
for m in client.memories.list(limit=5):
    print(m["topic"], "—", m["summary"])
    print("  source:", m["source_citation"]["type"], "/", m["source_citation"]["channel"])

# Ask a question — Aurra runs semantic search + LLM answer
result = client.memories.query("What does Alice do on Tuesdays?")
print(result.answer)
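Records returned by `memories.list` are plain dicts, so you can post-process them client-side. As one example, here is a small helper (not part of the SDK) that groups listed memories by their `topic` field:

```python
from collections import defaultdict

def group_by_topic(memories: list[dict]) -> dict[str, list[str]]:
    """Group memory records by topic, collecting their summaries in order."""
    grouped = defaultdict(list)
    for m in memories:
        grouped[m["topic"]].append(m["summary"])
    return dict(grouped)
```

Call it as `group_by_topic(client.memories.list(limit=50))` to get a quick per-topic view of what the agent has remembered.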

Bring Your Own LLM

By default, Aurra extracts memories using its hosted Anthropic key. Pass an llm config to use your own provider + key per request:

# Extract with your own OpenAI key
client.memories.add(
    messages=[...],
    session_id="...",
    llm={
        "provider": "openai",
        "api_key": "sk-proj-...",
        "model": "gpt-4o",
    },
)

# Or your own Anthropic key
client.memories.add(
    messages=[...],
    session_id="...",
    llm={
        "provider": "anthropic",
        "api_key": "sk-ant-...",
        "model": "claude-opus-4-5",
    },
)

Supported providers today: anthropic, openai. User-supplied keys are never logged or stored.

Source citations

Every memory comes back with a source_citation object so you can trace where it came from:

mem = client.memories.list(limit=1)[0]
print(mem["source_citation"])
# {
#   "type": "agent_session",
#   "channel": "user_alice_session_1",
#   "captured_at": "2026-04-30T22:31:09.123456+00:00",
#   "session_id": "user_alice_session_1",
#   "original_message": "User: Hi, I'm Alice...\nAssistant: Nice to meet you!"
# }

Audit trails

Get the full provenance + extraction metadata + history for any memory:

audit = client.memories.get_audit(memory_id="aa5edf37-0475-4e40-bcd5-17c2557f4d6a")
print(audit["provenance"]["original_input"])
print(audit["extraction"]["model"])           # "claude-opus-4-5"
print(audit["history"])                        # [{"event": "created", "at": "...", "by": "aurra-extractor"}]
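For logging or debugging it can help to flatten an audit record into readable lines. This helper is not part of the SDK; it assumes only the fields shown above:

```python
def format_audit(audit: dict) -> str:
    """Render an audit record (as returned by get_audit) as readable lines."""
    lines = [
        f"extracted by: {audit['extraction']['model']}",
        f"original input: {audit['provenance']['original_input']}",
    ]
    for event in audit.get("history", []):
        lines.append(f"  {event['at']}  {event['event']} by {event['by']}")
    return "\n".join(lines)
```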

Errors

Typed exceptions for predictable handling:

from aurra import (
    AurraAuthError,           # 401: bad Aurra key
    AurraValidationError,     # 400: bad request shape
    AurraNotFoundError,       # 404: memory not found
    AurraRateLimitError,      # 429: Aurra rate limit
    AurraLLMProviderError,    # BYO-LLM: provider rejected (bad key, model not found, etc.)
    AurraServerError,         # 5xx: Aurra backend error
    AurraConnectionError,     # network: couldn't reach Aurra
    AurraError,               # base class for all of the above
)

try:
    client.memories.add(
        messages=[{"role": "user", "content": "test"}],
        session_id="s",
        llm={"provider": "openai", "api_key": "sk-bad", "model": "gpt-4o"},
    )
except AurraLLMProviderError as e:
    print(f"{e.provider}: {e.message}")  # "openai: Invalid OpenAI API key"
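AurraRateLimitError (429) is usually transient, so it pairs naturally with retry-and-backoff. One way to do that is a small stdlib-only decorator like the sketch below (not part of the SDK):

```python
import time
from functools import wraps

def retry_on(exc_type, attempts: int = 3, base_delay: float = 1.0):
    """Retry a callable on a given exception type with exponential backoff."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except exc_type:
                    if attempt == attempts - 1:
                        raise  # out of attempts: surface the original error
                    time.sleep(base_delay * 2 ** attempt)
        return wrapper
    return decorator

# Usage with the SDK:
# @retry_on(AurraRateLimitError)
# def save(messages, session_id):
#     return client.memories.add(messages=messages, session_id=session_id)
```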

Use with Claude

import anthropic
from aurra import Aurra

memory = Aurra(api_key="aurra_...")
claude = anthropic.Anthropic()

def ask_with_memory(question: str) -> str:
    context = memory.memories.query(question)
    response = claude.messages.create(
        model="claude-opus-4-5",
        max_tokens=1024,
        messages=[{
            "role": "user",
            "content": f"Context:\n{context.answer}\n\nQuestion: {question}",
        }],
    )
    return response.content[0].text
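To close the loop, you can also write each new exchange back into memory so future questions benefit from it. A tiny helper (not part of the SDK) shapes a Q&A pair into the messages list that memories.add expects:

```python
def to_memory_messages(question: str, answer: str) -> list[dict]:
    """Shape a Q&A exchange as the messages list memories.add expects."""
    return [
        {"role": "user", "content": question},
        {"role": "assistant", "content": answer},
    ]
```

After `answer = ask_with_memory(question)`, call `memory.memories.add(messages=to_memory_messages(question, answer), session_id=...)` to capture any new facts from the exchange.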

Use with OpenAI

from openai import OpenAI
from aurra import Aurra

memory = Aurra(api_key="aurra_...")
openai_client = OpenAI()

def ask_with_memory(question: str) -> str:
    context = memory.memories.query(question)
    response = openai_client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": f"Context: {context.answer}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

API reference

| Method | Description |
| --- | --- |
| `Aurra(api_key, base_url=None, timeout=60.0)` | Create a client. |
| `client.memories.add(messages, session_id, tenant_id=None, llm=None)` | Extract memories from a dialog. Returns `AddResult`. |
| `client.memories.add(content, topic, importance, source)` | Write a single memory verbatim (legacy shape). |
| `client.memories.list(limit=20)` | List recent memories with citations. |
| `client.memories.query(question, limit=10)` | Ask a question, get an AI-answered response. Returns `QueryResult`. |
| `client.memories.get_audit(memory_id)` | Full provenance + extraction + history for one memory. |


License

MIT
