OpenCrew
Run persistent memory agents locally. Deploy in one command.
OpenCrew gives you a local runtime for building agents that actually remember — powered by Letta for persistent memory, with a CLI, Python SDK, and REST API out of the box.
CrewAI for one-shot tasks. OpenCrew for agents that actually remember.
Quickstart
pip install opencrew
opencrew init # guided: [1] full local stack (Docker + Letta) or [2] cloud API + workspace key
opencrew create my-agent
# Edit my-agent/agent.yaml with your persona and tools
opencrew run my-agent "What's the best way to load a CSV in Python?"
opencrew memory my-agent
First-time setup: opencrew init walks you through everything the CLI needs — either a local stack (Docker, Postgres, Letta, local API, with optional Anthropic/OpenAI keys for Letta) or a remote one (OpenCrew API URL + workspace API key, with optional BYO Letta URL/token). Both profiles can coexist in ~/.opencrew/config.yaml under local: and remote:; the CLI/SDK uses remote when api_url and api_key are set there, and falls back to local otherwise.
Non-interactive: opencrew init --local, opencrew init --cloud --api-url https://... --api-key ..., or env OPENCREW_API_URL / OPENCREW_API_KEY.
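A sketch of what that file can look like, with both profiles present. Values are placeholders and the exact key names may differ from what opencrew init generates — check the file it writes:

```yaml
# ~/.opencrew/config.yaml — both profiles may coexist
local:
  api_url: http://localhost:8741
  api_key: <local-api-key>
remote:
  api_url: https://api.example.com
  api_key: <workspace-api-key>
  # optional BYO Letta:
  # letta_base_url: https://letta.example.com
  # letta_token: <token>
```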
What You Get
- Persistent memory — agents remember across sessions, not just within them
- Per-client scoping — give each end-user their own memory-isolated agent instance
- Config-as-code — YAML agent configs, git-committable and reproducible
- Tool plugins — register custom Python/TypeScript tools or MCP servers
- Skills system — SKILL.md knowledge files agents consult on demand
- Run management — async runs with SSE streaming, full history, and logs
- Python SDK — programmatic access to everything the CLI does
- REST API — FastAPI server with OpenAPI docs at /docs
Prerequisites
- Python 3.11+
- Docker (Docker Desktop or Docker Engine)
- Node.js 20+ (includes npm)
- LLM API key (Anthropic or OpenAI)
- Letta server reachable at LETTA_BASE_URL (see docs/README.md) — this is separate from OpenCrew's catalog Postgres (agent types, runs, API keys)
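A quick way to sanity-check these prerequisites before running opencrew init. This is a standalone sketch, not part of the CLI — opencrew doctor performs the real health checks:

```python
import shutil
import sys

REQUIRED = ["docker", "node", "npm"]  # binaries the CLI expects on PATH

def missing_tools(which=shutil.which, required=REQUIRED):
    """Return the subset of required binaries not found on PATH."""
    return [tool for tool in required if which(tool) is None]

if __name__ == "__main__":
    problems = missing_tools()
    if sys.version_info < (3, 11):
        problems.append("python>=3.11")
    print("missing:", problems or "nothing")
```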
Installation
pip install opencrew
This installs both the opencrew CLI and the Python SDK (from opencrew import OpenCrew).
Setup
opencrew init
This command:
- Checks that Docker and Node.js are installed
- Prompts for your LLM API key (or reads it from the ANTHROPIC_API_KEY env var)
- Starts Postgres and the Letta server via Docker Compose
- Runs database migrations
- Installs the Letta Code SDK
- Generates your API key
- Starts the OpenCrew API server
Everything runs on your local machine; nothing leaves it except the LLM API calls.
CLI Reference
Lifecycle
opencrew init # Set up everything (Docker, DB, Letta, API server)
opencrew start # Start services (if stopped)
opencrew stop # Stop services
opencrew destroy # Remove all data and infrastructure
opencrew doctor # Check that everything is healthy
Agents
opencrew create my-agent # Scaffold agent config YAML
opencrew apply my-agent/agent.yaml # Sync config to server
opencrew list # List all agent types
Running
opencrew run my-agent "Hello!" # Run and stream response
opencrew run my-agent "Hello!" --client=user123 # Per-client memory isolation
opencrew run my-agent "Hello!" --verbose # Show reasoning + tool calls
opencrew run my-agent "Hello!" --fresh # Throwaway agent (no memory)
Memory
opencrew memory my-agent # Show what the agent remembers
opencrew memory my-agent --client=user123 # Per-client memory
opencrew reset my-agent # Wipe memory and start fresh
History
opencrew logs # All recent runs
opencrew logs my-agent # Runs for a specific agent
opencrew logs --status=failed # Filter by status
Agent Config (YAML)
name: my-agent
slug: my-agent
description: A helpful coding assistant
model: anthropic/claude-sonnet-4-20250514
persona: |
  You are a Python expert specialized in data science.
  You prefer clean, well-documented code.
memory:
  rules: "Always use type hints. Prefer pandas over raw loops."
  project: "Building a REST API with FastAPI"
tools:
  - name: run-python
    slug: run-python
    source_type: python
    source_code: |
      def run_python(code: str) -> str:
          """Execute Python code and return output."""
          import subprocess
          result = subprocess.run(
              ["python3", "-c", code],
              capture_output=True, text=True, timeout=30,
          )
          return result.stdout or result.stderr
mcp_servers:
  - name: filesystem
    slug: filesystem
    server_url: https://your-mcp-server.example.com/sse
skills:
  - slug: python-best-practices
    path: ./skills/python-best-practices/SKILL.md
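Before opencrew apply, it can be handy to catch obvious config mistakes locally. A minimal, hypothetical pre-flight check — the field names come from the example above, and the OpenCrew server does its own authoritative validation:

```python
REQUIRED_FIELDS = ("name", "slug", "model", "persona")

def check_agent_config(config: dict) -> list[str]:
    """Return a list of problems found in a parsed agent.yaml dict."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if not config.get(f)]
    for tool in config.get("tools", []):
        if tool.get("source_type") == "python" and not tool.get("source_code"):
            problems.append(f"tool {tool.get('slug', '?')}: python tool without source_code")
    return problems

# Parse the file with e.g. yaml.safe_load(open("my-agent/agent.yaml")) and pass the dict in.
```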
Python SDK
from opencrew import OpenCrew
client = OpenCrew(api_key="<your-workspace-api-key>")
# Create an agent type
client.agent_types.create(
name="my-agent",
slug="my-agent",
persona="You are a Python expert.",
model="anthropic/claude-sonnet-4-20250514",
memory_defaults={"rules": "Always use type hints."},
)
# Trigger a run and stream results
run = client.runs.create(agent_type="my-agent", message="What's the best way to load a CSV?")
for event in client.runs.stream(run["id"]):
if event.type == "assistant":
print(event.content, end="")
# Check memory
for block in client.instances.memory("my-agent"):
print(f"{block['label']}: {block['value']}")
Async
import asyncio
from opencrew import AsyncOpenCrew

async def main():
    client = AsyncOpenCrew(api_key="<your-workspace-api-key>")
    run = await client.runs.create(agent_type="my-agent", message="hello")
    async for event in client.runs.stream(run["id"]):
        print(event.content, end="")

asyncio.run(main())
SDK-Only Mode
The SDK works with any OpenCrew server — local or remote:
# Local server (default after opencrew init)
client = OpenCrew(api_key="<your-workspace-api-key>", base_url="http://localhost:8741")
# Remote / cloud server
client = OpenCrew(api_key="<your-workspace-api-key>", base_url="https://api.example.com")
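The precedence rule from the Setup section (remote when both OPENCREW_API_URL and OPENCREW_API_KEY are set, otherwise local) is easy to mirror in your own code. A hypothetical resolver — OPENCREW_LOCAL_API_KEY is an invented placeholder for wherever you keep the local key:

```python
import os

LOCAL_BASE_URL = "http://localhost:8741"  # default after `opencrew init`

def resolve_target(env=os.environ):
    """Pick (base_url, api_key): remote if both env vars are set, else local."""
    url, key = env.get("OPENCREW_API_URL"), env.get("OPENCREW_API_KEY")
    if url and key:
        return url, key
    return LOCAL_BASE_URL, env.get("OPENCREW_LOCAL_API_KEY", "")

# base_url, api_key = resolve_target()
# client = OpenCrew(api_key=api_key, base_url=base_url)
```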
Architecture
Your Machine
├── CLI / Python SDK
│ └── HTTP → FastAPI server (localhost:8741)
│ └── ThreadPoolExecutor spawns subprocess
│ └── tsx runs TypeScript script
│ └── Letta Code SDK
│ └── HTTP → Letta Server (localhost:8283)
│ └── Letta's persistence (often Postgres)
All services run locally via Docker Compose:
- Postgres (pgvector) — may hold OpenCrew catalog tables (runs, keys, agent types) and/or be used by Letta, depending on your compose file; treat control-plane data and Letta agent state as logically separate for backups and migration
- Letta Server — manages agent memory, conversations, and tool execution
- FastAPI Server — orchestrates runs, manages config, serves the API
The only external traffic is LLM API calls (Anthropic/OpenAI).
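Under the hood, run streaming is plain SSE over HTTP — the SDK's runs.stream() handles this for you. A minimal sketch of the wire format, assuming each event arrives as a JSON payload on a data: line:

```python
import json

def parse_sse(lines):
    """Yield decoded JSON payloads from an iterable of SSE lines."""
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload and payload != "[DONE]":
                yield json.loads(payload)
```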
API Docs
After opencrew init, browse to http://localhost:8741/docs for the full OpenAPI documentation.
License
Apache-2.0