# onebrain-langchain

LangChain integration for OneBrain — a persistent AI memory layer for humans and agents. This package provides chat message history, a retriever, and agent tools that connect LangChain to OneBrain via the official Python SDK.
**Architecture:** This package wraps `onebrain-sdk` (not raw HTTP). If OneBrain's API changes, only the SDK needs updating — this package stays stable.
## Installation

```bash
pip install onebrain-langchain
```

Requires Python 3.9+.

Dependencies:

- `langchain-core>=0.3,<1`
- `onebrain-sdk>=1.0,<2`
## Configuration

Set your OneBrain API key as an environment variable or pass it directly:

```bash
export ONEBRAIN_API_KEY="ob_your_prefix:your_secret_here"
```

All components accept `api_key` and `base_url` parameters to override the defaults.
| Environment Variable | Description | Default |
|---|---|---|
| `ONEBRAIN_API_KEY` | Your OneBrain API key | — (required) |
| `ONEBRAIN_BASE_URL` | Base URL for self-hosted instances | `https://onebrain.rocks/api/eu` |
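The precedence described above — an explicit argument wins over the environment variable — can be sketched as follows. The `resolve_config` helper is hypothetical, shown for illustration only; the real resolution logic lives inside `onebrain-sdk`:

```python
import os
from typing import Optional

DEFAULT_BASE_URL = "https://onebrain.rocks/api/eu"

def resolve_config(
    api_key: Optional[str] = None,
    base_url: Optional[str] = None,
) -> tuple:
    """Explicit arguments win; environment variables are the fallback."""
    key = api_key or os.environ.get("ONEBRAIN_API_KEY")
    url = base_url or os.environ.get("ONEBRAIN_BASE_URL", DEFAULT_BASE_URL)
    if not key:
        raise ValueError("No API key: pass api_key or set ONEBRAIN_API_KEY")
    return key, url
```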
### Getting an API Key

1. Sign in at onebrain.rocks/dashboard.
2. Navigate to Settings > API Keys.
3. Click Create API Key and copy the full key (`ob_prefix:secret`).
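Keys have the form `ob_prefix:secret`. If you want to fail fast on a malformed key before making any network call, a client-side sanity check might look like this (`split_api_key` is a hypothetical helper, not part of this package):

```python
def split_api_key(key: str) -> tuple:
    """Split an 'ob_prefix:secret' key; raise ValueError on anything malformed."""
    prefix, sep, secret = key.partition(":")
    if not sep or not secret or not prefix.startswith("ob_"):
        raise ValueError("Expected an API key of the form 'ob_prefix:secret'")
    return prefix, secret
```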
## Quick Start

### Chat Message History

Store and retrieve conversation messages as OneBrain memories:

```python
from langchain_onebrain import OneBrainChatMessageHistory

# Create a history backed by OneBrain
history = OneBrainChatMessageHistory(
    api_key="ob_xxx:secret",
    session_id="session-abc-123",
)

# Add messages
history.add_user_message("What is the weather like?")
history.add_ai_message("I don't have real-time weather data.")

# Retrieve all messages
for msg in history.messages:
    print(f"{msg.type}: {msg.content}")

# Clear (archives messages, does not delete)
history.clear()
```
#### With ConversationChain

```python
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain_openai import ChatOpenAI
from langchain_onebrain import OneBrainChatMessageHistory

llm = ChatOpenAI(model="gpt-4")
memory_history = OneBrainChatMessageHistory(session_id="user-42")

chain = ConversationChain(
    llm=llm,
    memory=ConversationBufferMemory(
        chat_memory=memory_history,
        return_messages=True,
    ),
)

response = chain.predict(input="Tell me about quantum computing.")
```
### Retriever

Search OneBrain memories and return them as LangChain documents:

```python
from langchain_onebrain import OneBrainRetriever

retriever = OneBrainRetriever(
    api_key="ob_xxx:secret",
    top_k=5,
    search_mode="hybrid",  # "keyword", "vector", or "hybrid"
)

# Use directly
docs = retriever.invoke("user preferences for dark mode")
for doc in docs:
    print(doc.page_content)
    print(doc.metadata)  # id, type, confidence, score, source_type, created_at
```
#### With RetrievalQA

```python
from langchain.chains import RetrievalQA
from langchain_openai import ChatOpenAI
from langchain_onebrain import OneBrainRetriever

llm = ChatOpenAI(model="gpt-4")
retriever = OneBrainRetriever(top_k=10, search_mode="hybrid")

qa = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=retriever,
    chain_type="stuff",
)

answer = qa.invoke("What are the user's coding preferences?")
print(answer["result"])
```
### Agent Tools

Four tools for use with LangChain agents:

```python
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_onebrain import (
    OneBrainSearchTool,
    OneBrainWriteTool,
    OneBrainContextTool,
    OneBrainEntityTool,
)

# Create tools
search = OneBrainSearchTool(api_key="ob_xxx:secret")
write = OneBrainWriteTool(api_key="ob_xxx:secret")
context = OneBrainContextTool(api_key="ob_xxx:secret")
entity = OneBrainEntityTool(api_key="ob_xxx:secret")

# Use with an agent
llm = ChatOpenAI(model="gpt-4")
tools = [search, write, context, entity]

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant with access to the user's memory."),
    MessagesPlaceholder("chat_history", optional=True),
    ("human", "{input}"),
    MessagesPlaceholder("agent_scratchpad"),
])

agent = create_openai_tools_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)

result = executor.invoke({"input": "What do you know about my work projects?"})
print(result["output"])
```
### Tool Reference

| Tool | Name | Description |
|---|---|---|
| `OneBrainSearchTool` | `onebrain_search` | Search memories with keyword/vector/hybrid modes |
| `OneBrainWriteTool` | `onebrain_write` | Store new memories (facts, preferences, goals, etc.) |
| `OneBrainContextTool` | `onebrain_context` | Get optimized user context (brief/assistant/project/deep) |
| `OneBrainEntityTool` | `onebrain_entity` | List entities (people, organizations, tools) |
## Self-Hosted Setup

OneBrain supports self-hosted deployments. Point the SDK to your instance:

```python
from langchain_onebrain import OneBrainRetriever

retriever = OneBrainRetriever(
    api_key="ob_xxx:secret",
    base_url="https://brain.your-company.com/api/v1",
)
```

Or set the environment variable:

```bash
export ONEBRAIN_BASE_URL="https://brain.your-company.com/api/v1"
```
## API Reference

### OneBrainChatMessageHistory

| Parameter | Type | Default | Description |
|---|---|---|---|
| `api_key` | `str \| None` | `None` | OneBrain API key (or `ONEBRAIN_API_KEY` env var) |
| `session_id` | `str` | `"default"` | Unique session identifier |
| `base_url` | `str \| None` | `None` | API base URL override |

Methods:

- `messages` — Property returning `List[BaseMessage]`
- `add_message(message)` — Store a single message
- `add_messages(messages)` — Store multiple messages
- `clear()` — Archive all session messages
### OneBrainRetriever

| Parameter | Type | Default | Description |
|---|---|---|---|
| `api_key` | `str \| None` | `None` | OneBrain API key |
| `top_k` | `int` | `10` | Maximum results |
| `search_mode` | `str` | `"hybrid"` | Search mode: keyword/vector/hybrid |
| `base_url` | `str \| None` | `None` | API base URL override |
| `alpha` | `float \| None` | `None` | Keyword/vector balance (0.0-1.0) |

Methods:

- `invoke(query)` / `_get_relevant_documents(query)` — Returns `List[Document]`
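The `alpha` parameter suggests the usual convex blend of keyword and vector relevance scores in hybrid search. A sketch of that convention — this is an assumption for illustration, not confirmed by this package; check `onebrain-sdk` for the actual direction of the blend:

```python
def hybrid_score(keyword_score: float, vector_score: float, alpha: float = 0.5) -> float:
    """Convex blend of the two scores.

    Assumed convention: alpha=1.0 weights vector search fully,
    alpha=0.0 keyword search only. Verify against the SDK docs.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must be in [0.0, 1.0]")
    return alpha * vector_score + (1.0 - alpha) * keyword_score
```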
### OneBrainSearchTool

| Parameter | Type | Default | Description |
|---|---|---|---|
| `query` | `str` | — | Search query (required) |
| `top_k` | `int` | `10` | Maximum results |
| `mode` | `str` | `"hybrid"` | Search mode |
### OneBrainWriteTool

| Parameter | Type | Default | Description |
|---|---|---|---|
| `title` | `str` | — | Memory title (required) |
| `body` | `str` | — | Memory content (required) |
| `memory_type` | `str` | `"fact"` | Type: fact/preference/decision/goal/experience/skill |
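If you build write-tool payloads yourself, guarding the `memory_type` value client-side avoids a wasted round trip on a typo. The helper below is hypothetical, shown for illustration only:

```python
# Memory types accepted by OneBrainWriteTool, per its parameter reference.
VALID_MEMORY_TYPES = frozenset(
    {"fact", "preference", "decision", "goal", "experience", "skill"}
)

def check_memory_type(memory_type: str) -> str:
    """Return the value unchanged, or raise on anything outside the valid set."""
    if memory_type not in VALID_MEMORY_TYPES:
        raise ValueError(
            f"memory_type {memory_type!r} not in {sorted(VALID_MEMORY_TYPES)}"
        )
    return memory_type
```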
### OneBrainContextTool

| Parameter | Type | Default | Description |
|---|---|---|---|
| `scope` | `str` | `"deep"` | Scope: brief/assistant/project/deep |

### OneBrainEntityTool

| Parameter | Type | Default | Description |
|---|---|---|---|
| `entity_type` | `str \| None` | `None` | Filter by entity type |
| `limit` | `int` | `20` | Maximum results |
## Development

```bash
git clone https://github.com/azappnew/onebrain-langchain.git
cd onebrain-langchain
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"

# Run tests
pytest tests/ -v

# With coverage
pytest tests/ -v --cov=langchain_onebrain --cov-report=term-missing

# Lint
ruff check src/ tests/

# Type check
mypy src/
```
## License

MIT License. See LICENSE for details.

Copyright (c) 2026 AZapp One.