
CrewAI integration for the Dakera AI memory platform

Project description

crewai-dakera


Persistent, semantically-recalled memory for CrewAI agents, powered by Dakera.

Your CrewAI crews remember everything — across sessions, across restarts. Dakera handles embedding, storage, and retrieval server-side with no local model required.


Quick Start

Step 1 — Run Dakera

Dakera is a self-hosted memory server. Spin it up with Docker:

docker run -d \
  --name dakera \
  -p 3300:3300 \
  -e DAKERA_ROOT_API_KEY=dk-mykey \
  ghcr.io/dakera-ai/dakera:latest

For a production setup with persistent storage, use Docker Compose:

# Download and start
curl -sSfL https://raw.githubusercontent.com/Dakera-AI/dakera-deploy/main/docker-compose.yml \
  -o docker-compose.yml
DAKERA_API_KEY=dk-mykey docker compose up -d

# Verify it's running
curl http://localhost:3300/health

Full deployment guide: github.com/Dakera-AI/dakera-deploy
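You can also run the same health check from Python before wiring up the integration. This is a minimal sketch using only the standard library; it assumes the `/health` endpoint shown above returns HTTP 200 when the server is ready:

```python
# Check that the Dakera server is reachable before configuring CrewAI.
# Assumes GET /health returns HTTP 200 when the server is ready.
from urllib.request import urlopen
from urllib.error import URLError


def is_healthy(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if the Dakera /health endpoint responds with 200."""
    try:
        with urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False


print(is_healthy("http://localhost:3300"))
```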

Step 2 — Install the integration

pip install crewai-dakera

Step 3 — Add memory to your crew

from crewai import Crew, Agent, Task
from crewai.memory import LongTermMemory
from crewai_dakera import DakeraStorage

storage = DakeraStorage(
    api_url="http://localhost:3300",
    api_key="dk-mykey",
    agent_id="my-crew",
)

crew = Crew(
    agents=[...],
    tasks=[...],
    memory=True,
    long_term_memory=LongTermMemory(storage=storage),
)

result = crew.kickoff(inputs={"topic": "AI trends"})

Your crew now persists everything it learns across runs.


Installation

# Core + integration
pip install crewai-dakera

# With CrewAI (if not already installed)
pip install "crewai-dakera[crewai]"

Requirements: Python ≥ 3.10, a running Dakera server (see Step 1 above)


Configuration

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `api_url` | `str` | (required) | Dakera server URL (e.g. `http://localhost:3300`) |
| `api_key` | `str` | `""` | API key set via `DAKERA_ROOT_API_KEY` |
| `agent_id` | `str` | (required) | Logical identifier for this crew's memory |
| `min_importance` | `float` | `0.0` | Minimum importance score for recalled memories |
| `top_k` | `int` | `5` | Number of memories to surface per turn |

Use environment variables to avoid hardcoding credentials:

import os
from crewai_dakera import DakeraStorage

storage = DakeraStorage(
    api_url=os.environ["DAKERA_URL"],
    api_key=os.environ["DAKERA_API_KEY"],
    agent_id="research-crew",
)

Examples

Research crew with persistent memory

from crewai import Agent, Task, Crew, Process
from crewai.memory import LongTermMemory, ShortTermMemory, EntityMemory
from crewai_dakera import DakeraStorage

dakera = DakeraStorage(
    api_url="http://localhost:3300",
    api_key="dk-mykey",
    agent_id="research-crew",
)

researcher = Agent(
    role="Senior Researcher",
    goal="Uncover groundbreaking insights in {topic}",
    backstory="An expert researcher with decades of experience.",
    verbose=True,
)

writer = Agent(
    role="Content Writer",
    goal="Craft compelling reports based on research findings",
    backstory="A skilled writer who turns complex ideas into clear prose.",
    verbose=True,
)

research_task = Task(
    description="Research the latest developments in {topic}",
    expected_output="A detailed research report",
    agent=researcher,
)

write_task = Task(
    description="Write a blog post based on the research",
    expected_output="A polished 500-word article",
    agent=writer,
)

crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, write_task],
    process=Process.sequential,
    memory=True,
    long_term_memory=LongTermMemory(storage=dakera),
    verbose=True,
)

# First run — learns and stores findings
result = crew.kickoff(inputs={"topic": "quantum computing"})
print(result.raw)

# Second run — recalls prior research automatically
result = crew.kickoff(inputs={"topic": "quantum computing advances"})
print(result.raw)

Custom importance scoring

storage = DakeraStorage(
    api_url="http://localhost:3300",
    api_key="dk-mykey",
    agent_id="my-crew",
    min_importance=0.6,  # only surface high-quality memories
    top_k=10,
)
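Conceptually, `min_importance` and `top_k` act as a filter and a cap on what gets recalled. The sketch below illustrates those semantics only; in the real integration the scoring and filtering happen server-side in Dakera, and the `importance` field name here is hypothetical, not the actual wire format:

```python
# Illustrative only: how a min_importance threshold and top_k cap
# shape the set of recalled memories. The "importance" key is a
# hypothetical field for this sketch.
def filter_memories(
    memories: list[dict], min_importance: float = 0.0, top_k: int = 5
) -> list[dict]:
    """Drop low-importance memories, then keep the top_k best."""
    kept = [m for m in memories if m.get("importance", 0.0) >= min_importance]
    kept.sort(key=lambda m: m["importance"], reverse=True)
    return kept[:top_k]


memories = [
    {"value": "key finding", "importance": 0.9},
    {"value": "small talk", "importance": 0.2},
    {"value": "useful fact", "importance": 0.7},
]
print(filter_memories(memories, min_importance=0.6, top_k=10))
```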

How it works

  1. After each task, CrewAI calls DakeraStorage.save() with the result
  2. Dakera embeds the content server-side (no local model needed) and stores it with a semantic vector
  3. Before the next task, CrewAI calls DakeraStorage.search() — Dakera performs hybrid search (vector + BM25) and returns the most relevant past memories
  4. Memories decay gracefully over time based on access patterns — frequently-accessed memories stay prominent
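The save/search cycle above can be sketched with an in-memory stand-in. This is not the real `DakeraStorage` (which delegates embedding and hybrid search to the Dakera server); it only mirrors the interface shape CrewAI drives, with naive token overlap in place of vector + BM25 scoring:

```python
# In-memory stand-in for the save/search cycle described above.
# Token overlap is a placeholder for Dakera's server-side hybrid search.
from dataclasses import dataclass


@dataclass
class MemoryRecord:
    value: str
    metadata: dict
    access_count: int = 0  # access patterns feed the decay heuristic (step 4)


class InMemoryStorage:
    def __init__(self) -> None:
        self._records: list[MemoryRecord] = []

    def save(self, value: str, metadata: dict) -> None:
        # Step 1-2: CrewAI hands over each task result; Dakera would embed it.
        self._records.append(MemoryRecord(value, metadata))

    def search(self, query: str, top_k: int = 5) -> list[dict]:
        # Step 3: score stored memories against the query and return the best.
        query_tokens = set(query.lower().split())
        scored = []
        for record in self._records:
            overlap = len(query_tokens & set(record.value.lower().split()))
            if overlap:
                record.access_count += 1
                scored.append((overlap, record))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [
            {"value": r.value, "metadata": r.metadata} for _, r in scored[:top_k]
        ]
```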

Related packages

| Package | Framework | Language |
| --- | --- | --- |
| `langchain-dakera` | LangChain | Python |
| `llamaindex-dakera` | LlamaIndex | Python |
| `autogen-dakera` | AutoGen | Python |
| `@dakera-ai/langchain` | LangChain.js | TypeScript |



License

MIT © Dakera AI

Project details


Download files

Download the file for your platform.

Source Distribution

crewai_dakera-0.1.0.tar.gz (5.7 kB)


Built Distribution


crewai_dakera-0.1.0-py3-none-any.whl (4.8 kB)


File details

Details for the file crewai_dakera-0.1.0.tar.gz.

File metadata

  • Download URL: crewai_dakera-0.1.0.tar.gz
  • Size: 5.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for crewai_dakera-0.1.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `944ae66bfd1529d5b7b3c93a789d0f0edbc00d45760bc79ec98521ba99ccdf5b` |
| MD5 | `9446380c17156d90aade9a23b83c3972` |
| BLAKE2b-256 | `54100fd01de2428d8361687924767accfb6fe6ae424a1dabe333b319bb6162ad` |
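To verify a downloaded archive against the SHA256 digest above, you can stream it through the standard library's `hashlib`:

```python
# Verify a downloaded file against a published SHA256 digest.
import hashlib


def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


expected = "944ae66bfd1529d5b7b3c93a789d0f0edbc00d45760bc79ec98521ba99ccdf5b"
# assert sha256_of("crewai_dakera-0.1.0.tar.gz") == expected
```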


Provenance

The following attestation bundles were made for crewai_dakera-0.1.0.tar.gz:

Publisher: release.yml on Dakera-AI/dakera-crewai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file crewai_dakera-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: crewai_dakera-0.1.0-py3-none-any.whl
  • Size: 4.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for crewai_dakera-0.1.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `95481b9c708e365f7e1ca643e4b0f27609ccd5ceded0f75fcbfbff13917c1820` |
| MD5 | `e792d57a252d852534ed9a7b68ab0a11` |
| BLAKE2b-256 | `147915631b7d2976c77f941bd2a2f10cfb8a1a90779d2299485c40ae5eb79e49` |


Provenance

The following attestation bundles were made for crewai_dakera-0.1.0-py3-none-any.whl:

Publisher: release.yml on Dakera-AI/dakera-crewai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
