# coralbricks-langchain

CoralBricks + LangChain integration: retriever, chat history, memory helper, and agent tools.

Use CoralBricks as the memory backend for your LangChain applications: retriever, chat history, agent memory tools, and RAG middleware — all backed by the CoralBricks Memory API.
- Drop-in `CoralBricksRetriever` for any LCEL chain or RAG pipeline.
- Three agent tools (`store`, `search`, `forget`) for persistent memory across turns.
- `CoralBricksChatMessageHistory`: persistent chat history backed by the CoralBricks chat API.
- Memory stores: each store gets a dedicated index, and agents can share memory by using the same store name.
## Installation

```bash
pip install coralbricks-langchain
```

Requires Python 3.10+ and LangChain >= 1.0.
## API key

Get a CoralBricks API key from the CoralBricks web app.
## Quick start

```python
from coralbricks_langchain import CoralBricksMemory

memory = CoralBricksMemory(api_key="your_coralbricks_api_key")
memory.get_or_create_memory_store("langchain:my-app")
memory.set_session_id("user-123")

# Save a memory
mem_id = memory.save_memory("Pro plan costs $199/month with unlimited ops.")

# Search by meaning
hits = memory.search_memory("What does the Pro plan cost?", top_k=3)
for h in hits:
    print(h.get("score"), h.get("text"))

# Forget by meaning
memory.forget_memory("Pro plan pricing")
```
## Agent memory tools

Give your agent tools to store, search, and forget memories across turns. Pass the memory instance directly to `get_tools()`; no global state is required.

```python
from coralbricks_langchain import CoralBricksMemory, get_tools
from langchain.agents import create_agent  # LangChain >= 1.0
from langchain_openai import ChatOpenAI

memory = CoralBricksMemory(api_key="your_coralbricks_api_key")
memory.get_or_create_memory_store("langchain:support-agent")
memory.set_session_id("user-123")

tools = get_tools(memory)
model = ChatOpenAI(model="gpt-4o-mini", temperature=0)

agent = create_agent(
    model,
    tools=tools,
    system_prompt=(
        "You are a helpful assistant with persistent memory (CoralBricks). "
        "Before answering, always search memory for relevant context. "
        "When you learn something important, store it."
    ),
)

result = agent.invoke({"messages": [{"role": "user", "content": "Remember: Alex is on the Enterprise plan."}]})
print(result["messages"][-1].content)
```
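To make the tool contract concrete, here is a plain in-memory stand-in for the three behaviors `get_tools` exposes (no CoralBricks client, and naive keyword overlap in place of embedding search; all names here are illustrative, not the package's internals):

```python
import re
from dataclasses import dataclass, field

def _tokens(text: str) -> set:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

@dataclass
class InMemoryStore:
    """Illustrative stand-in for the store/search/forget tool semantics."""
    items: dict = field(default_factory=dict)
    _next_id: int = 0

    def store(self, text: str) -> str:
        mem_id = f"mem-{self._next_id}"
        self._next_id += 1
        self.items[mem_id] = text
        return mem_id

    def search(self, query: str, top_k: int = 5) -> list:
        # The real tool ranks by embedding similarity; this stand-in
        # fakes "search by meaning" with keyword overlap.
        q = _tokens(query)
        scored = sorted(
            ((len(q & _tokens(t)), t) for t in self.items.values()),
            key=lambda pair: pair[0],
            reverse=True,
        )
        return [text for score, text in scored[:top_k] if score > 0]

    def forget(self, query: str, top_k: int = 1) -> int:
        # Remove the best matches for a query; return how many were removed.
        hits = set(self.search(query, top_k))
        before = len(self.items)
        self.items = {k: v for k, v in self.items.items() if v not in hits}
        return before - len(self.items)

mem = InMemoryStore()
mem.store("Alex is on the Enterprise plan.")
mem.store("Pro plan costs $199/month.")
print(mem.search("Which plan is Alex on?"))  # the Alex memory ranks first
```

The point is the shape of the contract — store returns an id, search returns ranked text, forget removes by meaning — not the ranking itself, which CoralBricks does with embeddings server-side.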
## Retriever (LCEL chains)

`CoralBricksRetriever` implements `BaseRetriever` and drops into any LCEL chain:

```python
from coralbricks_langchain import CoralBricksMemory, CoralBricksRetriever

memory = CoralBricksMemory(api_key="your_coralbricks_api_key")
memory.get_or_create_memory_store("langchain:my-kb")

retriever = CoralBricksRetriever(memory=memory, top_k=5)
docs = retriever.invoke("What is the cancellation policy?")
for doc in docs:
    print(doc.page_content)
```
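Conceptually, a retriever just maps a query string to a ranked list of `Document` objects with a `page_content` field. A minimal stand-in of that mapping (a plain dataclass instead of LangChain's `Document`, and naive keyword overlap instead of the real semantic search):

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """Stand-in for langchain_core.documents.Document."""
    page_content: str
    metadata: dict = field(default_factory=dict)

def retrieve(query: str, corpus: list, top_k: int = 5) -> list:
    # Stand-in ranking by word overlap; real retrieval uses embeddings.
    q = set(query.lower().split())
    ranked = sorted(corpus, key=lambda text: -len(q & set(text.lower().split())))
    return [Document(page_content=text) for text in ranked[:top_k]]

corpus = [
    "The cancellation policy allows refunds within 30 days.",
    "The Pro plan costs $199/month.",
]
for doc in retrieve("What is the cancellation policy?", corpus, top_k=1):
    print(doc.page_content)
```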
## Chat message history

Persistent chat history backed by the CoralBricks chat API:

```python
from coralbricks_langchain import CoralBricksMemory, CoralBricksChatMessageHistory

memory = CoralBricksMemory(api_key="your_coralbricks_api_key")
history = CoralBricksChatMessageHistory(client=memory.client, conversation_id="conv-001")

history.add_user_message("Hello!")
history.add_ai_message("Hi, how can I help?")

for msg in history.messages:
    print(msg.type, msg.content)
```
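`CoralBricksChatMessageHistory` follows LangChain's `BaseChatMessageHistory` surface. The minimal contract it has to satisfy (typed messages, append helpers, a `clear()`) can be sketched without any backend; this is a pure in-memory illustration, not the package's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    type: str      # "human" or "ai", mirroring LangChain message types
    content: str

@dataclass
class InMemoryChatHistory:
    """Stand-in for the BaseChatMessageHistory-style contract."""
    messages: list = field(default_factory=list)

    def add_user_message(self, content: str) -> None:
        self.messages.append(Message("human", content))

    def add_ai_message(self, content: str) -> None:
        self.messages.append(Message("ai", content))

    def clear(self) -> None:
        self.messages.clear()

history = InMemoryChatHistory()
history.add_user_message("Hello!")
history.add_ai_message("Hi, how can I help?")
for msg in history.messages:
    print(msg.type, msg.content)
```

The CoralBricks version persists the same message list server-side, keyed by `conversation_id`, so history survives process restarts.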
## API reference

### CoralBricksMemory

| Method | Description |
|---|---|
| `CoralBricksMemory(api_key, base_url?)` | Create a memory instance (client created internally) |
| `.get_or_create_memory_store(name)` | Attach to or create a dedicated memory store (idempotent) |
| `.create_memory_store(name)` | Create a new store (raises if it exists) |
| `.set_project_id(id)` | Set the project namespace |
| `.set_session_id(id)` | Set the session/user namespace |
| `.save_memory(text, metadata?)` | Embed and store a memory item |
| `.search_memory(query, top_k=5)` | Semantic search over memories |
| `.forget_memory(query, top_k=5)` | Forget memories matching a semantic query |
### Other components

| Symbol | Description |
|---|---|
| `CoralBricksRetriever` | LangChain `BaseRetriever` for LCEL RAG pipelines |
| `CoralBricksChatMessageHistory` | LangChain `BaseChatMessageHistory` backed by CoralBricks chat storage |
| `get_tools(memory)` | Factory returning `[store, search, forget]` tools bound to a memory instance |
## Conventions

| Field | Example | Purpose |
|---|---|---|
| `store_name` | `langchain:support-agent` | Dedicated index for this app/use-case |
| `session_id` | `user-123`, `conv-001` | Conversation or user scope |
| `metadata` | `{"source": "policy"}` | Optional metadata stored with each item |
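These conventions compose into small helpers; for example (hypothetical helper names, assuming the `langchain:<app>` store-name pattern shown above):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MemoryScope:
    """Hypothetical grouping of the naming conventions above."""
    app: str          # app/use-case, e.g. "support-agent"
    session_id: str   # conversation or user scope, e.g. "user-123"

    @property
    def store_name(self) -> str:
        # One dedicated index per app/use-case.
        return f"langchain:{self.app}"

def item_metadata(source: str, **extra) -> dict:
    """Build the optional metadata dict stored with each memory item."""
    return {"source": source, **extra}

scope = MemoryScope(app="support-agent", session_id="user-123")
print(scope.store_name)                   # langchain:support-agent
print(item_metadata("policy", lang="en"))
```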
## License

Apache-2.0.
## File details

### coralbricks_langchain-0.3.0.tar.gz (source distribution)

- Size: 10.5 kB
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `811eec853d47d1efe5cd7ce4cdb177998fc9eb8dad05aea9e810b7bed01e2c23` |
| MD5 | `163d225db5048b303204fa77aea7dc47` |
| BLAKE2b-256 | `2e8b4e8b49edc425ef10f4a612a402556fec5a2375c53c51e7030ea6902abddd` |
### coralbricks_langchain-0.3.0-py3-none-any.whl (built distribution, Python 3)

- Size: 11.3 kB
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `008817f253204537e45dc6584f029949fd8127fcf753e14e768aa0825611252c` |
| MD5 | `534a09cc1a73a55a0f199591cabcf0ee` |
| BLAKE2b-256 | `92904546890dfdc6675991af18430bb9ae474d6c2010343307d333c6c1cf9049` |