# aurra-langchain

LangChain integration for Aurra, memory infrastructure for AI agents.
Drop-in `BaseChatMessageHistory` that gives your LangChain agents:

- **Bi-temporal versioning**: memories know what was true, when, and who superseded them
- **Citation-grounded retrieval**: every retrieved fact carries its source
- **Multi-tenant isolation**: scope memories per-user without index sprawl
- **Auto-supersession**: Aurra's classifier detects when a new fact replaces an old one
Targets LangChain 1.0+ (uses the `RunnableWithMessageHistory` pattern).
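To make the bi-temporal versioning and auto-supersession bullets concrete, here is a minimal client-side sketch of the idea: an old fact is never deleted, it just points at its replacement, so "what was true, when" stays queryable. Field and function names (`Memory`, `valid_from`, `superseded_by`, `supersede`) are illustrative, not Aurra's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Memory:
    fact: str
    valid_from: datetime                      # when the fact became true (valid time)
    recorded_at: datetime                     # when the system learned it (transaction time)
    superseded_by: Optional["Memory"] = None  # set when a newer fact replaces this one

def supersede(old: Memory, new: Memory) -> None:
    # Supersession keeps the old record queryable instead of deleting it;
    # history queries can still answer "what did we believe at time T?".
    old.superseded_by = new

now = datetime.now(timezone.utc)
a = Memory("Favorite coffee is Stumptown.", valid_from=now, recorded_at=now)
b = Memory("Favorite coffee is Blue Bottle.", valid_from=now, recorded_at=now)
supersede(a, b)
```

The two timestamps are what makes the versioning "bi-temporal": valid time and transaction time are tracked independently.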
## Installation

```bash
pip install aurra-langchain
```
## Usage

```python
from aurra_langchain import AurraChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_anthropic import ChatAnthropic

# Wire up history backed by Aurra
def get_session_history(session_id: str):
    return AurraChatMessageHistory(
        api_key="aurra_...",
        session_id=session_id,
        tenant_id="acme-corp",  # optional
    )

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant with memory."),
    MessagesPlaceholder("history"),
    ("human", "{input}"),
])

chain = prompt | ChatAnthropic(model="claude-haiku-4-5-20251001")

with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="history",
)

# Each turn writes to Aurra automatically
with_history.invoke(
    {"input": "My favorite coffee is Stumptown."},
    config={"configurable": {"session_id": "user-42-conv-7"}},
)
```
## Direct API

You can also use it standalone, without `RunnableWithMessageHistory`:

```python
history = AurraChatMessageHistory(
    api_key="aurra_...",
    session_id="user-42-conv-7",
)

history.add_user_message("My favorite coffee is Stumptown.")
history.add_ai_message("Got it.")

print(history.messages)  # list of messages, oldest first
```
## Configuration

| Param | Type | Default | Description |
|---|---|---|---|
| `api_key` | `str` | required | Your Aurra API key (from app.aurra.us) |
| `session_id` | `str` | required | Conversation identifier (groups extracted memories) |
| `tenant_id` | `str` | `None` | Multi-tenant scope (optional) |
| `base_url` | `str` | `https://api.aurra.us` | Override for self-hosted/staging |
| `auto_supersede` | `bool` | `None` | Per-key default if `None` |
| `max_messages_returned` | `int` | `50` | Cap on `history.messages` length |
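A toy in-memory sketch of how two of these options behave, `tenant_id` scoping the lookup and `max_messages_returned` capping the result. The real behavior lives server-side in Aurra; the store and helpers here are illustrative, and whether the cap keeps the newest or oldest turns is an assumption.

```python
# Toy store keyed by (tenant_id, session_id): one tenant can never see
# another tenant's history, even for an identical session_id.
store: dict[tuple[str, str], list[str]] = {}

def add(tenant_id: str, session_id: str, message: str) -> None:
    store.setdefault((tenant_id, session_id), []).append(message)

def messages(tenant_id: str, session_id: str, max_messages_returned: int = 50) -> list[str]:
    # Assumption: the cap keeps the most recent turns.
    history = store.get((tenant_id, session_id), [])
    return history[-max_messages_returned:]

for i in range(60):
    add("acme-corp", "conv-1", f"turn {i}")
add("other-tenant", "conv-1", "isolated")
```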
## What gets stored

Each `add_user_message` / `add_ai_message` call buffers the turn. When both sides of an exchange are present, the pair is sent to Aurra's `/agent/memories` endpoint in `messages` mode. Aurra's extractor LLM atomizes the exchange into individual factual memories.

The `messages` getter pulls memories back as `HumanMessage` / `AIMessage` objects in chronological order, reconstructed from each memory's `original_message` field.
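The buffering described above can be sketched as follows: hold a user turn until the matching AI turn arrives, then flush the pair as a single payload. The class and field names (`TurnBuffer`, `flushed`) are illustrative stand-ins for the package's internals, and `flushed` stands in for the HTTP call to `/agent/memories`.

```python
from typing import Optional

class TurnBuffer:
    def __init__(self) -> None:
        self.pending_user: Optional[str] = None
        self.flushed: list[dict] = []  # stands in for HTTP calls to Aurra

    def add_user_message(self, text: str) -> None:
        # Buffer until the AI side of the exchange arrives
        self.pending_user = text

    def add_ai_message(self, text: str) -> None:
        if self.pending_user is not None:
            # Both sides present: send the pair in "messages" mode
            self.flushed.append({
                "mode": "messages",
                "messages": [
                    {"role": "user", "content": self.pending_user},
                    {"role": "assistant", "content": text},
                ],
            })
            self.pending_user = None

buf = TurnBuffer()
buf.add_user_message("My favorite coffee is Stumptown.")
buf.add_ai_message("Got it.")
```

Pairing at flush time is why a lone `add_user_message` with no AI reply does not produce a memory until the exchange completes.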
### Lossiness note

Aurra is a fact store, not a verbatim message store. Round-tripping messages is approximate: `original_message` is preserved up to 500 chars per turn, and extracted memories may have refined wording. If you need 100% faithful conversation replay, use a different LangChain backend.
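A small illustration of the 500-character cap described above. The helper is hypothetical (the package's actual truncation code is not shown here); it only demonstrates that turns longer than the limit come back clipped.

```python
LIMIT = 500  # per-turn cap on original_message, per the note above

def truncate_original(text: str, limit: int = LIMIT) -> str:
    # Hypothetical helper: keep at most `limit` characters of a turn.
    return text if len(text) <= limit else text[:limit]

long_turn = "x" * 800
short_turn = "My favorite coffee is Stumptown."
```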
## Limitations (v0.1.0)

- Streaming not yet supported (PRs welcome)
- Tool integration deferred to v0.2.0
- Retriever class (separate from history) deferred to v0.2.0
- The legacy `BaseMemory` pattern (LangChain 0.x) is not supported; use `RunnableWithMessageHistory` with `AurraChatMessageHistory` instead
## License

MIT.