# crewai-soul 🧠

**The soul ecosystem for CrewAI agents.**

One package, full stack:
- Persistent Memory — Markdown-native, git-versionable, human-readable
- Hybrid Retrieval — RAG + RLM via soul-agent
- Database Intelligence — Auto-generated semantic layers via soul-schema
- Enterprise Ready — SoulMate API integration for production
## Install

```bash
pip install crewai-soul
```

This automatically installs:

- `soul-agent` — Hybrid RAG+RLM memory
- `soul-schema` — Database semantic layer generator
## Quick Start

### Basic Memory

```python
from crewai import Crew, Agent, Task
from crewai_soul import SoulMemory

# Create markdown-based memory with full RAG+RLM
memory = SoulMemory()

# Use it with your crew
crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    memory=memory,
)

result = crew.kickoff()
```

Your crew's memories are stored in `MEMORY.md` — human-readable, git-versionable, no database required.
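The actual on-disk layout is defined by soul-agent, but to illustrate why a markdown memory log stays greppable and diff-friendly, here is a minimal stdlib-only sketch that parses scoped entries out of a `MEMORY.md`-style file. The entry format shown (one `##` heading per scope, bullet entries beneath it) is an assumption for illustration, not the package's actual layout:

```python
import re

# Hypothetical MEMORY.md layout: one "## <scope>" heading per scope,
# followed by "- " bullet entries. Illustrative only.
SAMPLE = """\
## /decisions
- We decided to use PostgreSQL.
## /project/alpha
- Launch target is Q3.
"""

def parse_memory(text):
    """Return {scope: [entries]} from a markdown memory log."""
    scopes, current = {}, None
    for line in text.splitlines():
        heading = re.match(r"##\s+(\S+)", line)
        if heading:
            current = heading.group(1)
            scopes.setdefault(current, [])
        elif line.startswith("- ") and current:
            scopes[current].append(line[2:])
    return scopes

memories = parse_memory(SAMPLE)
print(memories["/decisions"])  # → ['We decided to use PostgreSQL.']
```

Because the log is plain markdown, the same file can be read in an editor, diffed in git, or parsed in a few lines like this.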
### Database Schema Intelligence

Give your agents an understanding of your database structure:

```python
from crewai_soul import SchemaMemory

# Connect to any SQLAlchemy-compatible database
schema = SchemaMemory("postgresql://user:pass@host/db")

# Auto-generate semantic descriptions using an LLM
schema.generate()

# Get context for a natural-language query
# (returns formatted markdown with the relevant tables/columns)
context = schema.context_for("Show me revenue by region")

# Use in agent prompts
agent = Agent(
    role="Data Analyst",
    goal="Answer business questions with SQL",
    backstory=f"You have access to this schema:\n{schema.to_markdown()}",
)
```
### Enterprise: SoulMate API

For production deployments with multi-tenant isolation:

```python
from crewai_soul import SoulMateClient

client = SoulMateClient(
    api_key="your-key",
    tenant_id="your-org",
)

# Store memories in the cloud
client.remember("Critical decision: We're going with PostgreSQL", scope="/project/alpha")

# Semantic search across all memories
results = client.recall("database decisions")

# Full RAG+RLM question answering
answer = client.ask("What database did we choose for Project Alpha?")
```
## Why crewai-soul?

| Feature | CrewAI built-in | crewai-soul |
|---|---|---|
| Storage | Vector database | Markdown files + optional vectors |
| Human-readable | ❌ | ✅ |
| Git-versionable | ❌ | ✅ |
| Database schema context | ❌ | ✅ (soul-schema) |
| Enterprise multi-tenant | ❌ | ✅ (SoulMate API) |
| RAG + RLM hybrid | ❌ | ✅ (soul-agent) |
| Infrastructure | Database server | None (or cloud API) |
## API Reference

### SoulMemory

```python
from crewai_soul import SoulMemory

memory = SoulMemory(
    soul_path="SOUL.md",      # Agent identity
    memory_path="MEMORY.md",  # Memory log
    provider="anthropic",     # LLM provider for RAG
    use_hybrid=True,          # Enable RAG+RLM (default: True if soul-agent is installed)
)

# Store a memory
memory.remember("We decided to use PostgreSQL.", scope="/decisions")

# Recall relevant memories
matches = memory.recall("What database did we choose?", limit=5)
for m in matches:
    print(f"[{m.score:.2f}] {m.content}")

# Clear memories (optionally by scope)
memory.forget(scope="/decisions")

# Get stats
print(memory.info())

# View the memory structure
print(memory.tree())
```
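`recall` returns scored matches. As a hedged sketch of how scored retrieval can work without a vector database (soul-agent's actual hybrid RAG+RLM pipeline is more sophisticated than this), here is a token-overlap scorer that produces the same `(score, content)` shape used in the loop above; the `Match` class and scoring are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Match:
    """Mirrors the m.score / m.content shape used above (illustrative)."""
    score: float
    content: str

def recall(query, entries, limit=5):
    """Rank stored entries by Jaccard token overlap with the query."""
    q = set(query.lower().split())
    matches = []
    for text in entries:
        t = set(text.lower().split())
        score = len(q & t) / len(q | t) if (q | t) else 0.0
        matches.append(Match(score, text))
    matches.sort(key=lambda m: m.score, reverse=True)
    return matches[:limit]

entries = ["We decided to use PostgreSQL.", "The launch target is Q3."]
for m in recall("What database did we choose?", entries):
    print(f"[{m.score:.2f}] {m.content}")
```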
### SchemaMemory

```python
from crewai_soul import SchemaMemory

schema = SchemaMemory(
    database_url="postgresql://...",
    llm_provider="anthropic",
)

# Generate descriptions for all tables
schema.generate()

# Or for specific tables only
schema.generate(tables=["customers", "orders"])

# Get a single table's description
info = schema.describe("customers")

# Generate context for a query
context = schema.context_for("revenue by region")

# Export in different formats
schema.save("schema.json", format="json")
schema.save("schema.yml", format="dbt")
schema.save("training.jsonl", format="vanna")

# Full markdown documentation
docs = schema.to_markdown()
```
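The exact export schemas are defined by soul-schema, but to illustrate the difference between a single-document JSON export and a line-delimited JSONL export (the shape Vanna-style training data typically uses), here is a stdlib-only sketch; the field names are assumptions:

```python
import json

# Hypothetical table descriptions; field names are illustrative.
tables = [
    {"name": "customers", "description": "One row per customer account."},
    {"name": "orders", "description": "One row per placed order."},
]

# JSON export: the whole schema as one document, easy to load at once.
json_doc = json.dumps({"tables": tables}, indent=2)

# JSONL export: one JSON object per line, easy to stream and append to.
jsonl_doc = "\n".join(json.dumps(t) for t in tables)

print(jsonl_doc.splitlines()[0])
```

JSONL is append-friendly, which suits incremental regeneration: re-describing one table can emit one new line instead of rewriting the whole document.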
### SoulMateClient

```python
from crewai_soul import SoulMateClient

client = SoulMateClient(
    api_key="...",    # or the SOULMATE_API_KEY env var
    base_url="...",   # or the SOULMATE_URL env var
    tenant_id="...",  # for multi-tenant isolation
)

# Store memories
client.remember("Important fact", scope="/project")

# Search
results = client.recall("important", limit=10)

# Full RAG+RLM Q&A
answer = client.ask("What do we know about...")

# Clear memories
client.forget(scope="/project")

# Get stats
info = client.info()
```
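Since the client can read `SOULMATE_API_KEY` and `SOULMATE_URL` from the environment when the arguments are omitted, a common pattern is to resolve credentials up front and fail fast when none are set. The resolution order sketched here (explicit argument first, then env var) is an assumption about the client's behavior, and `resolve_api_key` is a hypothetical helper, not part of the package:

```python
import os

def resolve_api_key(explicit=None):
    """Prefer an explicit key; fall back to the SOULMATE_API_KEY env var."""
    key = explicit or os.environ.get("SOULMATE_API_KEY")
    if not key:
        raise RuntimeError("Set SOULMATE_API_KEY or pass api_key explicitly")
    return key

os.environ["SOULMATE_API_KEY"] = "test-key"
print(resolve_api_key())  # → test-key
```

Failing at construction time, rather than on the first API call, makes misconfigured deployments visible immediately.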
## The Soul Ecosystem

crewai-soul brings together:

| Package | Purpose | PyPI |
|---|---|---|
| soul-agent | Persistent memory & identity | `pip install soul-agent` |
| soul-schema | Database semantic layers | `pip install soul-schema` |
| crewai-soul | CrewAI integration (this package) | `pip install crewai-soul` |

**SoulMate API** — Enterprise hosted service for production deployments.
## Links
- soul.py — Core memory library
- soul-schema — Database documentation
- SoulMate — Enterprise offering
- CrewAI — Multi-agent framework
- The Menon Lab — Research & tools
## License
MIT
## File details

### Source distribution: crewai_soul-0.3.0.tar.gz

- Size: 16.0 kB
- Tags: Source
- Uploaded using Trusted Publishing: No
- Uploaded via: twine/6.2.0, CPython/3.14.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | `9fd7480ce320f19504947037abf5cf65ad1604bf998dcbbd8b8afd07d16dac00` |
| MD5 | `296ad6e9dd282635c63b6baeb05b423a` |
| BLAKE2b-256 | `e1056561a64686efa95dbd2202b0def462bef162f119b75aaaffee765da35662` |
### Built distribution: crewai_soul-0.3.0-py3-none-any.whl

- Size: 13.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: No
- Uploaded via: twine/6.2.0, CPython/3.14.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | `93d34fdae72c213fe07e00dce16969532fff091c4df7d4d63b41bf36fe66cd08` |
| MD5 | `090dc048eb35a606d3abf81c2d40096d` |
| BLAKE2b-256 | `cf009d2fb9b9e8fcae629448305cc1fd1f2d298e5d9844371135359bb8c68f9f` |