# langchain-substrate

SUBSTRATE cognitive memory integration for LangChain and LangGraph.
Use SUBSTRATE as a persistent memory store for LangGraph agents or as a retriever in LangChain RAG pipelines. SUBSTRATE provides causal memory, semantic search, knowledge graphs, emotion state, identity verification, and 61 cognitive capability layers.
## Installation

```bash
pip install langchain-substrate
```

Or install from source:

```bash
cd integrations/langchain
pip install -e ".[dev]"
```
## Quick Start

### Environment Setup

```python
import os

os.environ["SUBSTRATE_API_KEY"] = "sk-sub-..."
```
### As a LangGraph Memory Store

Use `SubstrateStore` as the backing store for any LangGraph agent. This gives your agent persistent, semantically searchable memory across conversations.

```python
from langchain_substrate import SubstrateStore, SubstrateClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

# Create the SUBSTRATE-backed store
client = SubstrateClient(api_key=os.environ["SUBSTRATE_API_KEY"])
store = SubstrateStore(client=client)

# Create a LangGraph agent with SUBSTRATE memory
model = ChatOpenAI(model="gpt-4o")
agent = create_react_agent(model, tools=[], store=store)

# The agent now persists state to SUBSTRATE
config = {"configurable": {"thread_id": "conversation-1"}}
response = agent.invoke(
    {"messages": [{"role": "user", "content": "Remember that my favorite color is blue."}]},
    config=config,
)
```
### Store Operations

```python
# Store a value
store.put(("user", "alice"), "preferences", {"theme": "dark", "language": "en"})

# Retrieve by key
item = store.get(("user", "alice"), "preferences")
print(item.value)  # {"theme": "dark", "language": "en"}

# Semantic search across memory
results = store.search(("user", "alice"), query="color preferences", limit=5)
for item in results:
    print(item.key, item.value)

# Delete is a no-op (SUBSTRATE memory is append-only)
store.delete(("user", "alice"), "preferences")
```
### Multi-Tenant Isolation

```python
# Use namespace_prefix for tenant isolation
store = SubstrateStore(
    client=client,
    namespace_prefix="myapp.prod",
)

# All operations are scoped under "myapp.prod.*"
store.put(("user", "bob"), "state", {"step": 3})
```
### As a LangChain Retriever (RAG)

Use `SubstrateRetriever` in any LangChain RAG chain. It uses SUBSTRATE's hybrid search (semantic + keyword) to find relevant memories.

```python
from langchain_substrate import SubstrateRetriever
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_core.output_parsers import StrOutputParser

# Create the retriever
retriever = SubstrateRetriever(
    api_key=os.environ["SUBSTRATE_API_KEY"],
    top_k=5,
)

# Build a RAG chain
prompt = ChatPromptTemplate.from_template(
    "Answer based on the following context:\n{context}\n\nQuestion: {question}"
)
model = ChatOpenAI(model="gpt-4o")
chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)

answer = chain.invoke("What are the entity's core values?")
```
### Retriever with Namespace Scoping

```python
retriever = SubstrateRetriever(
    api_key=os.environ["SUBSTRATE_API_KEY"],
    namespace="app.conversations",
    top_k=10,
)
```
### Free Tier Fallback

`hybrid_search` requires the Pro tier. On the free tier, fall back to `memory_search`:

```python
retriever = SubstrateRetriever(
    api_key=os.environ["SUBSTRATE_API_KEY"],
    search_tool="memory_search",
)
```
## Async Support

All operations support async for use in async LangGraph workflows:

```python
import asyncio

from langchain_substrate import SubstrateStore, SubstrateClient

async def main():
    client = SubstrateClient(api_key="sk-sub-...")
    store = SubstrateStore(client=client)

    await store.aput(("user", "alice"), "mood", {"current": "happy"})
    item = await store.aget(("user", "alice"), "mood")
    print(item.value)

    results = await store.asearch(("user",), query="emotional state")
    for r in results:
        print(r.value)

asyncio.run(main())
```
## Architecture

```
LangGraph Agent / RAG Chain
            |
SubstrateStore / SubstrateRetriever
            |
SubstrateClient (httpx)
            |
SUBSTRATE MCP Server (JSON-RPC over HTTP)
            |
Causal Memory + Knowledge Graph + 61 Layers
```
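Each client call ultimately travels as a JSON-RPC 2.0 request to the MCP server's `tools/call` method. As a rough sketch of the wire shape (the envelope below follows the generic MCP convention; the exact argument shapes SUBSTRATE expects are an assumption, not documented API):

```python
def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 envelope MCP uses for a tools/call request."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# For example, a store.get() resolves to a memory_search tool call
# (the query string here is purely illustrative):
envelope = build_tool_call("memory_search", {"query": "user.alice.preferences"})
```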
### Namespace Encoding

LangGraph uses tuple namespaces like `("user", "alice", "prefs")`. SUBSTRATE uses flat string keys. The store encodes namespaces as dot-separated prefixes:

| LangGraph Namespace | SUBSTRATE Prefix |
|---|---|
| `("user", "alice")` | `user.alice` |
| `("app", "v2", "state")` | `app.v2.state` |
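The encoding itself amounts to a simple join, which also composes with the `namespace_prefix` option shown earlier (`encode_namespace` is a hypothetical helper illustrating the scheme, not part of the package's public API):

```python
def encode_namespace(namespace: tuple, prefix: str = "") -> str:
    """Join a LangGraph tuple namespace into a dot-separated SUBSTRATE prefix."""
    parts = (prefix,) + tuple(namespace) if prefix else tuple(namespace)
    return ".".join(parts)

print(encode_namespace(("user", "alice")))                     # user.alice
print(encode_namespace(("user", "bob"), prefix="myapp.prod"))  # myapp.prod.user.bob
```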
### Tool Mapping

| Store Operation | SUBSTRATE Tool | Tier |
|---|---|---|
| `put()` | `respond` | Free |
| `get()` | `memory_search` | Free |
| `search()` | `hybrid_search` | Pro |
| `list_namespaces()` | N/A (limited) | -- |
| `delete()` | No-op | -- |
## SUBSTRATE MCP Tools Available

| Tool | Description | Tier |
|---|---|---|
| `respond` | Send a message, get a response | Free |
| `memory_search` | Search causal memory episodes | Free |
| `hybrid_search` | Semantic + keyword search | Pro |
| `get_emotion_state` | Affective state vector | Free |
| `verify_identity` | Cryptographic identity check | Free |
| `knowledge_graph_query` | Query the knowledge graph | Pro |
| `get_values` | Core value architecture | Free |
| `theory_of_mind` | User model | Free |
| `get_trust_state` | Trust scores | Pro |
## Development

```bash
# Install dev dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run with coverage
pytest --cov=langchain_substrate --cov-report=term-missing

# Lint
ruff check src/ tests/

# Type check
mypy src/
```
## License

MIT -- Garmo Labs