Memory infrastructure for AI agents — citations, audit trails, bi-temporal versioning, BYO-LLM
Aurra Python SDK
Memory infrastructure for AI agents. Stores conversation memories with source citations, full audit trails, and Bring-Your-Own-LLM extraction.
```
pip install aurra
```
Quick start
```python
from aurra import Aurra

client = Aurra(api_key="aurra_...")

# Add memories from a dialog (Aurra extracts atomic facts via LLM)
result = client.memories.add(
    messages=[
        {"role": "user", "content": "Hi, I'm Alice. I play pickleball every Tuesday."},
        {"role": "assistant", "content": "Nice to meet you Alice!"},
    ],
    session_id="user_alice_session_1",
)
print(f"Saved {result.saved_count} memories: {result.memory_ids}")

# Read memories — every record includes a source_citation object
for m in client.memories.list(limit=5):
    print(m["topic"], "—", m["summary"])
    print("  source:", m["source_citation"]["type"], "/", m["source_citation"]["channel"])

# Ask a question — Aurra runs semantic search + LLM answer
result = client.memories.query("What does Alice do on Tuesdays?")
print(result.answer)
```
Bring Your Own LLM
By default, Aurra extracts memories using its hosted Anthropic key. Pass an `llm` config to use your own provider and key per request:
```python
# Extract with your own OpenAI key
client.memories.add(
    messages=[...],
    session_id="...",
    llm={
        "provider": "openai",
        "api_key": "sk-proj-...",
        "model": "gpt-4o",
    },
)

# Or your own Anthropic key
client.memories.add(
    messages=[...],
    session_id="...",
    llm={
        "provider": "anthropic",
        "api_key": "sk-ant-...",
        "model": "claude-opus-4-5",
    },
)
```
Supported providers today: `anthropic` and `openai`. User-supplied keys are never logged or stored.
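In practice you will usually load the key from the environment rather than hardcoding it. A minimal sketch, assuming the conventional `OPENAI_API_KEY` variable name (not something Aurra requires):

```python
import os

# Sketch: pull the BYO key from the environment instead of hardcoding it.
# OPENAI_API_KEY is the conventional variable name; any name works as long
# as it matches your deployment.
client.memories.add(
    messages=[{"role": "user", "content": "I moved to Berlin last month."}],
    session_id="user_bob_session_7",
    llm={
        "provider": "openai",
        "api_key": os.environ["OPENAI_API_KEY"],
        "model": "gpt-4o",
    },
)
```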
Source citations
Every memory comes back with a `source_citation` object so you can trace where it came from:
```python
mem = client.memories.list(limit=1)[0]
print(mem["source_citation"])
# {
#   "type": "agent_session",
#   "channel": "user_alice_session_1",
#   "captured_at": "2026-04-30T22:31:09.123456+00:00",
#   "session_id": "user_alice_session_1",
#   "original_message": "User: Hi, I'm Alice...\nAssistant: Nice to meet you!"
# }
```
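The `captured_at` field is an ISO 8601 timestamp, so it parses cleanly with the standard library. A small sketch:

```python
from datetime import datetime

# Sketch: turn the ISO 8601 capture timestamp into an aware datetime
# and display it in the local timezone.
captured = datetime.fromisoformat(mem["source_citation"]["captured_at"])
print(captured.astimezone().strftime("%Y-%m-%d %H:%M %Z"))
```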
Audit trails
Get the full provenance, extraction metadata, and history for any memory:
```python
audit = client.memories.get_audit(memory_id="aa5edf37-0475-4e40-bcd5-17c2557f4d6a")
print(audit["provenance"]["original_input"])
print(audit["extraction"]["model"])  # "claude-opus-4-5"
print(audit["history"])              # [{"event": "created", "at": "...", "by": "aurra-extractor"}]
```
Errors
Typed exceptions for predictable handling:
```python
from aurra import (
    AurraAuthError,         # 401: bad Aurra key
    AurraValidationError,   # 400: bad request shape
    AurraNotFoundError,     # 404: memory not found
    AurraRateLimitError,    # 429: Aurra rate limit
    AurraLLMProviderError,  # BYO-LLM: provider rejected (bad key, model not found, etc.)
    AurraServerError,       # 5xx: Aurra backend error
    AurraConnectionError,   # network: couldn't reach Aurra
    AurraError,             # base class for all of the above
)

try:
    client.memories.add(
        messages=[{"role": "user", "content": "test"}],
        session_id="s",
        llm={"provider": "openai", "api_key": "sk-bad", "model": "gpt-4o"},
    )
except AurraLLMProviderError as e:
    print(f"{e.provider}: {e.message}")  # "openai: Invalid OpenAI API key"
```
Use with Claude
```python
import anthropic
from aurra import Aurra

memory = Aurra(api_key="aurra_...")
claude = anthropic.Anthropic()

def ask_with_memory(question: str) -> str:
    context = memory.memories.query(question)
    response = claude.messages.create(
        model="claude-opus-4-5",
        max_tokens=1024,
        messages=[{
            "role": "user",
            "content": f"Context:\n{context.answer}\n\nQuestion: {question}",
        }],
    )
    return response.content[0].text
```
Use with OpenAI
```python
from openai import OpenAI
from aurra import Aurra

memory = Aurra(api_key="aurra_...")
openai_client = OpenAI()

def ask_with_memory(question: str) -> str:
    context = memory.memories.query(question)
    response = openai_client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": f"Context: {context.answer}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```
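To close the loop, you can write each exchange back into Aurra so future questions see it. A sketch building on the helper above; `ask_and_remember` and the session id are illustrative:

```python
def ask_and_remember(question: str, session_id: str = "demo_session") -> str:
    # Hypothetical wrapper: answer from memory, then persist the new exchange.
    answer = ask_with_memory(question)
    memory.memories.add(
        messages=[
            {"role": "user", "content": question},
            {"role": "assistant", "content": answer},
        ],
        session_id=session_id,
    )
    return answer
```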
API reference
| Method | Description |
|---|---|
| `Aurra(api_key, base_url=None, timeout=60.0)` | Create a client. |
| `client.memories.add(messages, session_id, tenant_id=None, llm=None)` | Extract memories from a dialog. Returns `AddResult`. |
| `client.memories.add(content, topic, importance, source)` | Write a single memory verbatim (legacy shape). |
| `client.memories.list(limit=20)` | List recent memories with citations. |
| `client.memories.query(question, limit=10)` | Ask a question and get an AI-answered response. Returns `QueryResult`. |
| `client.memories.get_audit(memory_id)` | Full provenance, extraction metadata, and history for one memory. |
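The `tenant_id` parameter in the table above does not appear in the earlier examples; a minimal sketch of scoping a write to one tenant (the tenant value is illustrative):

```python
# Sketch: tenant_id (documented in the table above) scopes the memory
# to a single tenant; "acme-corp" is an illustrative value.
client.memories.add(
    messages=[{"role": "user", "content": "Our fiscal year ends in March."}],
    session_id="acme_onboarding_1",
    tenant_id="acme-corp",
)
```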
Links
- Homepage: https://aurra.us
- Benchmarks: https://github.com/aurra-memory/benchmarks
- Issues: support@aurra.us
License
MIT