
EndeeCrewAI Integration

High-Performance Vector Database Integration for CrewAI Agents

EndeeCrewAI integrates Endee, a high-performance vector database, with CrewAI. It lets CrewAI agents store, retrieve, and manage vector-based knowledge efficiently, with metadata support for advanced search and contextual reasoning.


Features

  • Vector-based memory for CrewAI: Use Endee as a backend for short-term and entity memory.
  • High-performance search: Approximate Nearest Neighbor (ANN) searches for fast retrieval.
  • Metadata & filtering support: Store rich metadata and filter queries.
  • Embeddings integration: Supports any embedding provider (e.g., Google Gemini, OpenAI).

Installation

pip install endee-crewai

Ensure you also have the required dependencies for CrewAI and your chosen embedding provider:

pip install crewai crewai-tools google-genai

Environment Variables

Create a .env file to store API credentials:

ENDEE_API_TOKEN=your_endee_api_token
GOOGLE_API_KEY=your_google_api_key
OPENAI_API_KEY=your_openai_api_key
COHERE_API_KEY=your_cohere_api_key

Note: You can use any supported embedding provider (OpenAI, Google, Cohere, HuggingFace). Supply the API key for the provider you use and omit the rest.

ENDEE_API_TOKEN is always required: it authenticates access to your Endee vector database.
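
For example, a small stdlib-only helper (hypothetical, not part of the package) can fail fast when the required token is missing:

```python
import os

def get_endee_token() -> str:
    """Read ENDEE_API_TOKEN from the environment, failing fast if it is unset."""
    token = os.environ.get("ENDEE_API_TOKEN")
    if not token:
        raise RuntimeError(
            "ENDEE_API_TOKEN is not set; it is required to access your Endee vector database."
        )
    return token
```

Load your .env first (e.g. with python-dotenv's load_dotenv()), then call get_endee_token() before constructing the store.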


Quick Start

Initialize Endee Vector Store

import time

from endee_crewai import EndeeVectorStore

# Embedding provider configuration (e.g., OpenAI, Google, or Cohere)
embedder_config = {
    "provider": "cohere",
    "config": {"model_name": "small", "api_key": "<COHERE_API_KEY>"}
}

# Create Endee store
memory_store = EndeeVectorStore(
    type="my_index",
    api_token="<ENDEE_API_TOKEN>",
    embedder_config=embedder_config,
    space_type="cosine",
    crew=None,
)

# Reset the index if needed (requires allow_reset=True, the default)
memory_store.reset()
time.sleep(2)  # Give the reset a moment to complete

Save Documents

documents = [
    ("Python is dynamically typed.", {"creator": "Guido van Rossum", "typing": "dynamic"}),
    ("Go is statically typed.", {"creator": "Robert Griesemer", "typing": "static"})
]

for text, meta in documents:
    memory_store.save(text, meta)
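
The save loop above can be wrapped in a small helper (a sketch; the only method assumed is save(text, metadata), exactly as used above):

```python
def save_documents(store, documents):
    """Save (text, metadata) pairs via store.save() and return the count saved."""
    count = 0
    for text, meta in documents:
        store.save(text, meta)
        count += 1
    return count
```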

Query Vector Store

results = memory_store.search(
    query="Python typing discipline",
    limit=3,
    filter={"typing": {"$eq": "dynamic"}}
)

for r in results:
    print(f"ID: {r['id']}, Score: {r['score']}, Text: {r['context']}")
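
Since each result is a dict with id, score, and context keys, picking the best hit is a one-liner (a sketch that assumes higher scores mean closer matches; invert the key if your space_type returns raw distances):

```python
def best_match(results):
    """Return the context of the highest-scoring result, or None when empty."""
    if not results:
        return None
    return max(results, key=lambda r: r["score"])["context"]
```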

CrewAI Integration

from crewai.memory import ShortTermMemory, EntityMemory
from crewai import Agent, Crew, Task, Process, LLM

# Create CrewAI memory objects
short_memory = ShortTermMemory(storage=memory_store)
entity_memory = EntityMemory(storage=memory_store)

# Define the LLM (use any model supported by CrewAI)
llm = LLM(model="gemini-2.5-flash-lite", api_key="<GOOGLE_API_KEY>")

# Define an agent
agent = Agent(
    role="Programming Language Expert",
    goal="Answer questions using stored programming language knowledge.",
    backstory="Consult documents and provide concise answers.",
    llm=llm,
    memory=short_memory,
    verbose=True
)

# Define a task
task = Task(
    description="Answer questions about programming languages.",
    agent=agent,
    expected_output="Concise and accurate answers."
)

# Run Crew
crew = Crew(
    agents=[agent],
    tasks=[task],
    process=Process.sequential,
    memory=True,
    short_term_memory=short_memory,
    entity_memory=entity_memory,
    verbose=True
)

result = crew.kickoff()
print(result)

Helper Functions

You can define custom helpers for:

  • Checking vector entries in Endee
  • Testing knowledge retrieval
  • Validating entity memory

Example:

def check_language_vectors(memory_store):
    """Spot-check stored knowledge by printing top hits for a few queries."""
    queries = ["Python language features", "Go concurrency"]
    for q in queries:
        results = memory_store.search(query=q, limit=3)
        for r in results:
            print(r["context"])

Configuration Options

The EndeeVectorStore constructor accepts the following parameters:

  • type: Name of the Endee index.
  • api_token: Your Endee API token (required if endee_index is not provided).
  • embedder_config: Configuration for the embedding provider (OpenAI, Google, Cohere, HuggingFace).
  • space_type: Distance metric for vector search. One of "cosine", "l2", or "ip" (default: "cosine").
  • allow_reset: Whether to allow resetting (deleting) the index (default: True).
  • crew: Optional CrewAI object for integration.
  • text_key: Key to store text in metadata (default: "value").
  • endee_index: Optional existing Endee index object.
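
Putting these options together, a read-only store over an existing index might look like this (a configuration sketch using only the parameters listed above; the index name, model name, and text key are placeholders):

```python
from endee_crewai import EndeeVectorStore

store = EndeeVectorStore(
    type="docs_index",                # existing Endee index name (placeholder)
    api_token="<ENDEE_API_TOKEN>",
    embedder_config={
        "provider": "openai",
        "config": {"model_name": "text-embedding-3-small", "api_key": "<OPENAI_API_KEY>"},
    },
    space_type="ip",                  # inner-product similarity instead of cosine
    allow_reset=False,                # protect the index from accidental resets
    text_key="text",                  # store document text under "text" in metadata
)
```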
