# raindrop-crewai

Raindrop integration for CrewAI (Python). Automatically captures crew kickoff invocations, multi-agent collaboration, task execution, and token usage by monkey-patching `Crew.kickoff*`.
## Installation

```bash
pip install raindrop-crewai crewai
```
## Quick Start

```python
from raindrop_crewai import RaindropCrewAI
from crewai import Agent, Crew, Task

raindrop = RaindropCrewAI(
    api_key="your-write-key",
    user_id="user-123",
)
raindrop.setup()  # auto-patch all Crew.kickoff* methods

agent = Agent(
    role="Senior Researcher",
    goal="Find one interesting fact about {topic}",
    backstory="You are an experienced researcher.",
)
task = Task(
    description="Identify one interesting fact about {topic}.",
    expected_output="A single sentence fact.",
    agent=agent,
)
crew = Crew(agents=[agent], tasks=[task])

result = crew.kickoff(inputs={"topic": "AI safety"})
print(result.raw)

raindrop.shutdown()
```
## Factory function (alternative)

```python
from raindrop_crewai import create_raindrop_crewai

raindrop = create_raindrop_crewai(api_key="your-write-key", user_id="user-123")
wrapped = raindrop.wrap(crew)  # per-instance wrap (no global monkey-patch)
result = wrapped.kickoff(inputs={"topic": "AI safety"})
raindrop.shutdown()
```
## What Gets Captured

- Crew kickoff invocations — input variables, crew name, process type (sequential/hierarchical)
- Agent metadata — agent roles and count
- Task metadata — task descriptions, count, and per-task summaries
- Token usage — `ai.usage.prompt_tokens`, `ai.usage.completion_tokens`, `ai.usage.cached_tokens`, `ai.usage.total_tokens`
- Model name — extracted from the first agent's `llm.model` (when set)
- Errors — `error.type`/`error.message` properties; the original exception is always re-raised
- Async support — `kickoff()`, `kickoff_async()`, `kickoff_for_each()`, `kickoff_for_each_async()` are all instrumented
- Nested OTel trace spans — crew workflow → agent → task → LLM, via the `opentelemetry-instrumentation-crewai` instrumentor (see Tracing; per-tool spans are not produced — see Known Limitations)
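The token-usage properties listed above can be sketched as a plain dict, the way they appear on a flat event; the values here are illustrative, not from a real run.

```python
# Illustrative sketch only: the token-usage property names captured on
# a flat Raindrop event, with made-up values. total_tokens is the sum
# of prompt and completion tokens.
usage = {
    "ai.usage.prompt_tokens": 412,
    "ai.usage.completion_tokens": 58,
    "ai.usage.cached_tokens": 0,
    "ai.usage.total_tokens": 470,
}
assert usage["ai.usage.total_tokens"] == (
    usage["ai.usage.prompt_tokens"] + usage["ai.usage.completion_tokens"]
)
```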
## Configuration patterns

### Auto-patch all crews (recommended)

Monkey-patches the `Crew` class so every `kickoff*` call is traced:

```python
from raindrop_crewai import setup_crewai

raindrop = setup_crewai(api_key="your-write-key", user_id="user-123")
# Every Crew.kickoff(...) is now traced automatically.
```
### Wrap a specific crew

Per-instance wrap with no global side effects:

```python
raindrop = RaindropCrewAI(api_key="your-write-key", user_id="user-123")
wrapped = raindrop.wrap(crew)
result = wrapped.kickoff(inputs={"topic": "..."})
```
## Debug Mode

Enable verbose logging with `debug=True`:

```python
raindrop = RaindropCrewAI(api_key="your-write-key", debug=True)
```

This sets the `raindrop_crewai` logger to DEBUG, surfacing telemetry-side failures that are otherwise swallowed (so the user's pipeline never crashes due to instrumentation).
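The same output can also be surfaced with the standard library's logging configuration instead of `debug=True`; a minimal sketch, assuming only that the logger name is `raindrop_crewai` as stated above:

```python
import logging

# Route the integration's DEBUG messages to stderr without touching the
# root logger. The logger name "raindrop_crewai" is the one the
# integration uses, per the note above.
logger = logging.getLogger("raindrop_crewai")
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
logger.addHandler(handler)
```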
## Identify Users

Associate events with a user identity after initialization:

```python
raindrop.identify("user-123", {"name": "Alice", "plan": "pro"})
```
## Track Signals

Attach feedback, edits, or other custom signals to a previously shipped event by its `event_id`. The Python SDK does not currently expose a `lastEventId` accessor like the TypeScript client does, so to attach a signal you need either (a) the `event_id` returned to your application out-of-band (e.g. logged by `debug=True`), or (b) the public ID surfaced on the dashboard's event detail page.

```python
raindrop.track_signal(
    event_id="<event-id-from-dashboard-or-debug-log>",
    name="thumbs_up",
    signal_type="feedback",
    sentiment="POSITIVE",
    comment="Great answer!",
)
```
## Tracing

When `tracing_enabled=True` (the default), the integration activates the `opentelemetry-instrumentation-crewai` instrumentor via `traceloop-sdk`, producing nested OTel spans for every crew execution:

- Crew workflow — root span covering the entire `kickoff()` call (`crewai.workflow`)
- Agent execution — child span per agent (`{role}.agent`)
- Task execution — child span per task (`{name}.task`)
- LLM calls — leaf spans for each underlying LLM call, with model name and token usage

Trace spans land in the dashboard's Traces tab after async ingestion (typically 30–120s).

To ship flat events only, without OTel spans:

```python
raindrop = RaindropCrewAI(api_key="your-write-key", tracing_enabled=False)
```
## Async Usage

```python
import asyncio

async def main():
    result = await crew.kickoff_async(inputs={"topic": "AI safety"})
    print(result.raw)

    results = await crew.kickoff_for_each_async(
        inputs=[{"topic": "AI"}, {"topic": "ML"}]
    )

asyncio.run(main())
```
## Flushing and Shutdown

```python
raindrop.flush()     # flush pending data
raindrop.shutdown()  # flush + release resources (call before process exit)
```
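For processes that may exit without reaching an explicit `shutdown()` (scripts, short-lived workers), registering the shutdown with `atexit` is a simple safeguard. A sketch using a stand-in object, since the real `RaindropCrewAI` instance comes from your own setup code:

```python
import atexit

class FakeRaindrop:
    """Stand-in for a RaindropCrewAI instance (illustration only)."""
    def shutdown(self):
        print("flushing telemetry before exit")

raindrop = FakeRaindrop()

# atexit runs registered callables on normal interpreter exit, so
# buffered telemetry is flushed even without an explicit shutdown().
registered = atexit.register(raindrop.shutdown)
```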
## API Reference

### RaindropCrewAI

| Parameter | Type | Default | Description |
|---|---|---|---|
| `api_key` | `Optional[str]` | `None` | Raindrop API key. If `None`, telemetry shipping is disabled (a `UserWarning` is issued). |
| `user_id` | `Optional[str]` | `None` | Default user identifier for all events. |
| `convo_id` | `Optional[str]` | `None` | Conversation/thread ID to group related events. |
| `tracing_enabled` | `bool` | `True` | Enable distributed tracing via the OTel CrewAI instrumentor. |
| `bypass_otel_for_tools` | `bool` | `True` | Forwarded to `raindrop.init()`. Controls whether tool spans on the underlying SDK skip the OTLP path. The CrewAI integration itself does not call `interaction.track_tool()`, so this flag has no observable effect on CrewAI events; exposed for SDK API parity. |
| `debug` | `bool` | `False` | Enable debug logging. |
### Methods

| Method | Description |
|---|---|
| `setup()` | Monkey-patch `Crew.kickoff*` so every kickoff is traced (one-time, idempotent). |
| `wrap(crew)` | Per-instance wrap of a single `Crew` (no global side effects). Returns the wrapped crew. |
| `flush()` | Flush all pending events to the Raindrop API. |
| `shutdown()` | Flush remaining events and release resources. |
| `identify(user_id, traits)` | Identify a user with optional traits (`Dict[str, str \| int \| bool \| float]`). |
| `track_signal(event_id, name, ...)` | Track a signal event. See the `track_signal` signature in `wrapper.py` for the full keyword-only parameters. |
### Module-level helpers

- `create_raindrop_crewai(...)` — construct a `RaindropCrewAI`; returns a no-op instance on failure (never crashes).
- `setup_crewai(...)` — convenience: construct a `RaindropCrewAI` and call `setup()` in one step.
## Known Limitations

- No per-tool spans / empty `event.toolCalls` — the underlying `opentelemetry-instrumentation-crewai` instrumentor only wraps `Crew.kickoff` (workflow), `Agent.execute_task` (`.agent`), `Task.execute_sync` (`.task`), and `LLM.call` (LLM generation). It does NOT produce a span per tool invocation, so `event.toolCalls` on the dashboard will be empty for CrewAI runs. Tool usage surfaces only in the agent's final output text and indirectly in the agent/task span durations.
- Streaming — CrewAI's `CrewStreamingOutput` is returned to the caller as-is; the flat Raindrop event is shipped after the stream completes.
- `finish_reason` is per-LLM, not per-crew — `CrewOutput` does not expose `finish_reason`. Per-LLM finish reasons appear on the LLM trace spans inside the workflow trace, not on the flat event.
- Python SDK feature surface — the Python SDK is module-level and does not expose `EventShipper`/`TraceShipper` classes. The `identify()` and `track_signal()` methods are pass-through wrappers around the `raindrop.analytics.*` module functions.
- wrapt < 2 required for tracing — the OTel CrewAI instrumentor uses `wrapt.wrap_function_wrapper(..., module=...)`, which was removed in wrapt 2.0. The package's dev dependencies pin `wrapt<2`; production environments using wrapt 2.x will see no trace spans (the flat event still ships).
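Because the wrapt 2.x failure mode is silent (flat events ship, spans don't), a startup check can be worth adding. A small sketch using only the standard library, with the `<2` threshold taken from the limitation above; the helper name is hypothetical:

```python
from importlib import metadata

def wrapt_tracing_compatible() -> bool:
    """Return True if the installed wrapt is older than 2.0, the
    version range needed for OTel CrewAI trace spans per the
    limitation described above."""
    try:
        major = int(metadata.version("wrapt").split(".")[0])
    except metadata.PackageNotFoundError:
        return False  # not installed: the instrumentor cannot patch anything
    return major < 2
```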
## Testing

```bash
cd packages/crewai-python
pip install -e ".[crewai,dev]"

# All tests. Unit tests run unconditionally; e2e tests skip gracefully
# unless RAINDROP_WRITE_KEY + OPENAI_API_KEY + RAINDROP_DASHBOARD_TOKEN
# are all set (this is what CI does).
python -m pytest tests/ -v

# End-to-end against the live Raindrop backend (manual run, requires
# all three env vars). Get RAINDROP_DASHBOARD_TOKEN from app.raindrop.ai
# DevTools → Network → any backend.raindrop.ai request → Authorization
# header (token expires every ~30 min).
RAINDROP_WRITE_KEY=your-write-key \
RAINDROP_DASHBOARD_TOKEN=eyJ... \
OPENAI_API_KEY=sk-... \
python -m pytest tests/test_e2e.py -v
```
## Documentation

Full documentation: Raindrop CrewAI Integration.