
CoralBricks + CrewAI integration: client, memory helper, and tools.


coralbricks-crewai

Use CoralBricks as the memory backend for your CrewAI agents: store and search semantic memories over the Coralbricks Memory API.

  • Keep long‑lived knowledge across runs and sessions (policies, FAQs, user preferences, summaries).
  • Share memory across multiple crews/agents via project_id.
  • Keep using your existing LLM (OpenAI, etc.) – Coralbricks only replaces the memory/KB layer.

Installation

From PyPI:

pip install coralbricks-crewai

Requires Python 3.10+ and CrewAI.


API key and base URL

  • API key: Get a Coralbricks API key from the Coralbricks web app. Use it for all requests to the Memory API.
  • Base URL: Use the Coralbricks CrewAI Memory API: https://memory.coralbricks.ai

Environment variables (optional but convenient):

export CORALBRICKS_API_KEY="your_coralbricks_api_key"
export CORAL_MEMORY_BASE_URL="https://memory.coralbricks.ai"
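Whether the client picks these variables up automatically is not guaranteed, so resolving them explicitly keeps a missing key from surfacing only on the first request. A stdlib-only helper sketch (the `resolve_coralbricks_config` function is hypothetical, not part of the package):

```python
import os


def resolve_coralbricks_config() -> dict:
    """Read the Coralbricks API key and base URL from the environment.

    The base URL falls back to the documented default; a missing API
    key raises immediately instead of failing on the first request.
    """
    api_key = os.environ.get("CORALBRICKS_API_KEY")
    if not api_key:
        raise RuntimeError("CORALBRICKS_API_KEY is not set")
    return {
        "api_key": api_key,
        "base_url": os.environ.get(
            "CORAL_MEMORY_BASE_URL", "https://memory.coralbricks.ai"
        ),
    }
```

The resulting dict can then be unpacked into the client constructor, e.g. `CoralBricksMemory(api_key=cfg["api_key"])`.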

Quick start: client and memory

This is the simplest "just store and search nuggets" flow (no CrewAI yet):

from coralbricks_crewai import CoralBricksMemory

memory = CoralBricksMemory(api_key="your_coralbricks_api_key")

# Create (or reconnect to) a named memory store
memory.create_memory_store("crewai:my-app")
memory.set_session_id("conv-123")

# Save a memory
memory.save_memory(
    "Cancellations within 24h of check-in incur a $50 fee.",
    metadata={"source": "policy_pdf"},
)

# Search by meaning
hits = memory.search_memory("What is the cancellation fee?", top_k=3)
for h in hits:
    print(h.get("score"), h.get("text"))

# Forget memories by meaning
memory.forget_memory("cancellation fee")

get_or_create_memory_store() is the idempotent alternative — safe to call on every startup:

memory = CoralBricksMemory(api_key="your_coralbricks_api_key")
memory.get_or_create_memory_store("crewai:my-app")
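Per the API reference, search_memory returns a list of {text, score, ...} dicts. Before feeding hits into a prompt, it often helps to drop weak matches; a stdlib-only filter sketch (the score scale, higher meaning more similar, is an assumption):

```python
def top_hits(hits, min_score=0.5, limit=3):
    """Keep the strongest matches: drop hits below min_score,
    sort the rest by descending score, and cap the count."""
    kept = [h for h in hits if h.get("score", 0.0) >= min_score]
    kept.sort(key=lambda h: h.get("score", 0.0), reverse=True)
    return kept[:limit]


hits = [
    {"text": "Cancellation fee is $50 within 24h.", "score": 0.91},
    {"text": "Support hours: Mon-Fri 9am-6pm EST.", "score": 0.34},
    {"text": "Refunds are prorated after 30 days.", "score": 0.62},
]
# 0.91 and 0.62 survive the cut; 0.34 is dropped.
for h in top_hits(hits):
    print(h["score"], h["text"])
```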

Before vs after: what Coralbricks adds

Without Coralbricks: A CrewAI travel agent gives a generic 2-day Tokyo itinerary—same for every user.

With Coralbricks: You store a memory (e.g. “Team prefers staying near Shibuya station, loves ramen, hates long queues”). The agent searches that memory and returns a personalized itinerary. You’ll see phrases that only appear because of memory, for example:

  • “Location near Shibuya matches your preference.”
  • “You can have ramen for breakfast.”
  • “Short queues” or “avoid long waits.”

So: without Coralbricks → a generic answer; with Coralbricks → the same agent, but it recalls preferences and weaves them into the plan. The only extra pieces are CoralBricksMemory and the SearchCoralBricksMemoryTool attached to the agent.
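The "weaves them into the plan" step can be sketched as plain prompt assembly: hits returned by search_memory are appended to the task text before it reaches the agent. The `personalize_task` helper here is hypothetical, not part of the package:

```python
def personalize_task(base_task: str, memory_hits: list[dict]) -> str:
    """Append remembered preferences to a task description so the
    agent's plan reflects them. Hits are the {text, score, ...} dicts
    returned by search_memory; only their text is used here."""
    if not memory_hits:
        return base_task
    prefs = "\n".join(f"- {h['text']}" for h in memory_hits)
    return f"{base_task}\n\nKnown preferences (from memory):\n{prefs}"


task = personalize_task(
    "Plan a 2-day Tokyo itinerary.",
    [{"text": "Team prefers staying near Shibuya station", "score": 0.9},
     {"text": "Loves ramen; hates long queues", "score": 0.8}],
)
print(task)
```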


CrewAI tool: search Coralbricks memory

Give your agents a tool that searches Coralbricks memory by natural language:

from crewai import Agent, Task, Crew, Process
from langchain_openai import ChatOpenAI

from coralbricks_crewai import CoralBricksMemory, SearchCoralBricksMemoryTool

# 1. LLM (CrewAI still uses your LLM; Coralbricks only handles memory)
llm = ChatOpenAI(model="gpt-4o-mini")

# 2. Coralbricks memory — create (or reconnect to) a dedicated store
memory = CoralBricksMemory(api_key="your_coralbricks_api_key")
memory.get_or_create_memory_store("crewai:my-app")
memory.set_session_id("conv-123")

# 3. Create the search tool with the memory instance
memory_search_tool = SearchCoralBricksMemoryTool(memory=memory)

# Optionally seed some knowledge
memory.save_memory("Support hours: Mon–Fri 9am–6pm EST. Emergency line 24/7.")
memory.save_memory("Refund policy: full refund within 30 days; then prorated.")

# 4. Agent that can use the Coralbricks search tool
agent = Agent(
    role="Support assistant",
    goal="Answer user questions using stored policies and FAQs.",
    backstory="You search Coralbricks memory for relevant nuggets before answering.",
    tools=[memory_search_tool],
    llm=llm,
)

task = Task(
    description=(
        "The user asks: 'What are your support hours and what is your refund policy?'\n"
        "1. Use the Coralbricks memory search tool to find support hours and refund policy.\n"
        "2. Answer in 2–3 short sentences, based only on what you found."
    ),
    expected_output="A short answer that cites support hours and refund policy from memory.",
    agent=agent,
)

crew = Crew(agents=[agent], tasks=[task], process=Process.sequential)

if __name__ == "__main__":
    result = crew.kickoff()
    print(result)

The SearchCoralBricksMemoryTool receives its CoralBricksMemory instance directly via the constructor — no global state needed.


2‑hop example (hotel → cancellation policy)

This pattern shows Coralbricks being used as long‑term memory across steps:

  1. Hop 1 – user: "I want to book a hotel in Tokyo for 2 nights."
    Agent chooses a specific hotel and booking ref, then calls a save tool that does:

    memory.save_memory("Selected hotel: Hotel Tokyo Plaza, ref XYZ123, check-in March 15.")
    
  2. Hop 2 – user: "What's your cancellation policy?"
    Agent calls search_coralbricks_memory("cancellation policy") to get the policy nugget, and also searches for the saved booking.
    Answer might be:

    "For your booking at Hotel Tokyo Plaza (ref XYZ123, March 15), cancellations within 24 hours incur a $50 fee; earlier cancellations are fully refunded."
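The two hops can be simulated end to end with an in-memory stub standing in for CoralBricksMemory. The stub "searches" by keyword overlap rather than embeddings, so it is purely illustrative of the save-then-recall flow:

```python
class StubMemory:
    """Minimal stand-in for CoralBricksMemory: stores raw text and
    ranks by keyword overlap instead of semantic similarity."""

    def __init__(self):
        self.items = []

    def save_memory(self, text, metadata=None):
        self.items.append(text)

    def search_memory(self, query, top_k=5):
        words = set(query.lower().split())
        scored = [(len(words & set(t.lower().split())), t) for t in self.items]
        scored = [(s, t) for s, t in scored if s > 0]
        scored.sort(reverse=True)
        return [{"text": t, "score": s} for s, t in scored[:top_k]]


memory = StubMemory()

# Hop 1: the agent saves the chosen booking and a policy nugget.
memory.save_memory("Selected hotel: Hotel Tokyo Plaza, ref XYZ123, check-in March 15.")
memory.save_memory("Cancellations within 24h of check-in incur a $50 fee.")

# Hop 2: the agent recalls both the booking and the policy.
for h in memory.search_memory("cancellation policy for my check-in"):
    print(h["text"])
```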


API reference

Method                                     Description
CoralBricksMemory(api_key)                 Create a memory instance.
create_memory_store(name)                  Create a new dedicated store (TurboPuffer index). Raises if it already exists.
get_or_create_memory_store(name)           Idempotent: attach to an existing store or create it. Safe on every startup.
set_session_id(id)                         Scope operations to a conversation or user.
save_memory(text, metadata=None)           Store a memory. Returns the memory id.
search_memory(query, top_k=5)              Search by meaning. Returns a list of {text, score, ...}.
forget_memory(query, top_k=5)              Forget the closest memories matching a query.
SearchCoralBricksMemoryTool(memory=...)    CrewAI tool that agents can call to search memory.

Conventions

Field        Example                 Purpose
store_name   crewai:hotel-support    Dedicated memory store (own index).
session_id   conv-123, user-42       Conversation or user scope.
metadata     {"source": "policy"}    Optional metadata stored with the item.

Multiple crews can share the same memory by using the same store_name (and possibly different session_ids).
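When several crews are configured in different files, these naming conventions are easy to drift from; tiny helpers (hypothetical, not part of the package) keep store and session names consistent:

```python
def store_name(app: str) -> str:
    """Build a store name following the crewai:<app> convention."""
    return f"crewai:{app}"


def session_id(kind: str, ident: str) -> str:
    """Build a session id like conv-123 or user-42."""
    return f"{kind}-{ident}"


# Two crews sharing one store, scoped to different users:
print(store_name("hotel-support"))  # crewai:hotel-support
print(session_id("user", "42"))     # user-42
```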


How this changes CrewAI behavior

Without Coralbricks:

  • Agents are mostly stateless across runs, or rely on ad‑hoc local storage.
  • There is no easy, shared, semantic memory across crews.

With Coralbricks:

  • You get a remote semantic memory with its own dedicated index:
    • get_or_create_memory_store → create or reconnect to a store.
    • save_memory → store nuggets and doc chunks.
    • search_memory / SearchCoralBricksMemoryTool → retrieve by meaning.
    • forget_memory → remove memories by meaning.
  • Crews can share one memory store (same store_name).
  • You still bring your own LLM (OpenAI, etc.); Coralbricks only handles the memory/KB side.

License

Apache-2.0.
