SwisperStudio SDK - Tracing integration for Swisper (internal use)
SwisperStudio SDK
Simple, high-performance integration for tracing Swisper LangGraph applications.
v0.4.0 - Redis Streams Architecture:
- 50x faster - 500ms → 10ms overhead
- LLM reasoning - See the thinking process (`<think>...</think>`)
- Connection status - Heartbeat-based health monitoring
- Per-node config - Fine-grained control
Installation
From PyPI (Recommended)
```shell
pip install "swisper-studio-sdk>=0.5.2"
```
That's it! No authentication needed.
From Source (Development)
```shell
git clone https://github.com/Fintama/swisper_studio.git
cd swisper_studio/sdk
pip install -e .
```
Note: Source installation requires Fintama GitHub organization access.
Quick Start (30 seconds)
1. Initialize at Startup (Redis Streams)
```python
# In your main.py or startup code
from swisper_studio_sdk import safe_initialize

# Async initialization (in lifespan or startup) - RECOMMENDED
status = await safe_initialize(
    redis_url="redis://swisper_studio_redis:6379",  # SwisperStudio Redis
    project_id="your-project-id",                   # From SwisperStudio
    enabled=True,
)

if status["initialized"]:
    logger.info("SwisperStudio tracing enabled")
else:
    logger.info("Tracing disabled (Swisper continues normally)")
```
Note: `safe_initialize()` never blocks or raises exceptions. If Redis is unavailable, Swisper continues normally with tracing disabled.
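The never-raise contract can be pictured with a small stdlib-only sketch. `safe_initialize_sketch` is an illustrative stand-in, not the SDK's internals; only the shape of the returned status dict mirrors the example above.

```python
import asyncio

async def safe_initialize_sketch(connect):
    """Illustrative stand-in for safe_initialize: swallow all connection
    errors so tracing setup can never take the host app down."""
    try:
        await connect()
        return {"initialized": True, "error": None}
    except Exception as exc:  # tracing is optional; never propagate
        return {"initialized": False, "error": str(exc)}

async def redis_down():
    raise ConnectionError("redis unavailable")

status = asyncio.run(safe_initialize_sketch(redis_down))
# The app keeps running; status simply reports the failure.
```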
2. ONE LINE CHANGE to Enable Tracing
```python
# Before:
from langgraph.graph import StateGraph
graph = StateGraph(GlobalSupervisorState)

# After (change ONE line):
from swisper_studio_sdk import create_traced_graph
graph = create_traced_graph(GlobalSupervisorState, trace_name="supervisor")

# That's it! All nodes added to this graph are automatically traced!
```
3. Add Nodes as Normal
```python
# Add nodes - they're automatically traced!
graph.add_node("intent_classification", intent_classification_node)
graph.add_node("memory", memory_node)
graph.add_node("planner", planner_node)
graph.add_node("ui_node", ui_node)

# Compile and run as usual
app = graph.compile()
result = await app.ainvoke(initial_state)

# All executions are now traced to SwisperStudio!
```
Features
Core Features:
- ✅ One-line integration - `create_traced_graph()` instead of `StateGraph()`
- ✅ Auto-instrumentation - All nodes automatically traced
- ✅ State capture - Captures input/output state at each node
- ✅ Error tracking - Captures exceptions and error messages
- ✅ Nested observations - Supports parent-child relationships
- ✅ Zero boilerplate - No decorators needed on individual nodes
v0.4.0 New Features:
- ✅ Redis Streams - 50x faster than HTTP (500ms → 10ms)
- ✅ LLM Reasoning - Captures `<think>...</think>` tags from DeepSeek R1, o1, etc.
- ✅ Streaming Support - Captures full responses from streaming LLM calls
- ✅ Connection Status - Verifies the SwisperStudio consumer is running
- ✅ Per-Node Config - Enable/disable reasoning per node
- ✅ Memory Safety - Auto-cleanup prevents memory leaks
Advanced Usage
LLM Reasoning Capture
Control reasoning capture per node:
```python
from swisper_studio_sdk import traced

# Enable reasoning with a custom length limit
@traced("classify_intent", capture_reasoning=True, reasoning_max_length=20000)
async def classify_intent_node(state):
    # Captures <think>...</think> tags (up to 20KB)
    return state

# Disable reasoning for specific nodes
@traced("memory_node", capture_reasoning=False)
async def memory_node(state):
    # No reasoning captured (faster, less data)
    return state

# Use defaults (reasoning enabled, 50KB limit)
@traced("global_planner")
async def global_planner_node(state):
    return state
```
What gets captured:
- ✅ LLM prompts (system + user messages)
- ✅ Reasoning process (`<think>...</think>` tags)
- ✅ Final responses (structured output or streaming)
- ✅ Token usage (prompt + completion)
Supported models:
- DeepSeek R1 (with reasoning)
- OpenAI o1/o3 (with reasoning)
- GPT-4, Claude, Llama (no reasoning, just prompts + responses)
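Separating the `<think>...</think>` block from the final answer can be sketched with a stdlib regex. This is illustrative only (`split_reasoning` and its truncation behavior are not the SDK's actual implementation); it just shows the shape of the capture described above, including the length limit.

```python
import re

def split_reasoning(text: str, max_length: int = 50_000):
    """Separate <think>...</think> reasoning from the final answer.
    Returns (reasoning_or_None, answer)."""
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match is None:
        return None, text  # non-reasoning models: the whole text is the answer
    reasoning = match.group(1).strip()[:max_length]  # truncate long traces
    answer = text[match.end():].strip()
    return reasoning, answer

reasoning, answer = split_reasoning("<think>check units first</think>42 km")
```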
Manual Tracing (Optional)
For fine-grained control, use the `@traced` decorator:

```python
from swisper_studio_sdk import traced

# Full control over the observation
@traced(
    name="intent_classification",
    observation_type="GENERATION",
    capture_reasoning=True,
    reasoning_max_length=10000,
)
async def intent_classification_node(state):
    return state
```
Observation Types
- `AUTO` - Auto-detect based on LLM data (default, recommended)
- `SPAN` - Generic execution span
- `GENERATION` - LLM generation
- `EVENT` - Point-in-time event
- `TOOL` - Tool call
- `AGENT` - Agent execution
Architecture
Redis Streams (v0.4.0)
```
Your App (Swisper)             Redis Stream              SwisperStudio
       |                            |                          |
 @traced decorator                  |                          |
       |                            |                          |
 XADD event (1-2ms) --------------->|                          |
       |                            |                          |
 Return immediately                 |                          |
 (zero latency!)                    |                          |
                                    |<--- Consumer reads batch |
                                    |                          |
                                    |      Store in PostgreSQL |
```
Benefits:
- 50x faster than HTTP (500ms → 10ms overhead)
- No race conditions (ordered stream delivery)
- Reliable (persistent queue, automatic retry)
- Scalable (100k+ events/sec)
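Redis `XADD` takes a flat map of string fields, so nested trace data has to be serialized before publishing. The sketch below shows that flattening step; the field names and `to_stream_fields` helper are assumptions for illustration, not the SDK's real wire schema.

```python
import json
import time

def to_stream_fields(node: str, trace_id: str, payload: dict) -> dict:
    """Flatten a trace event into the string-to-string field map that
    Redis XADD expects. Field names are illustrative only."""
    return {
        "node": node,
        "trace_id": trace_id,
        "ts": f"{time.time():.6f}",
        "payload": json.dumps(payload),  # nested data must be serialized
    }

fields = to_stream_fields("planner", "trace-123", {"tokens": 512})
# With redis-py this would then be published fire-and-forget style:
#   await redis.xadd("observability:events", fields)
```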
How It Works
1. `create_traced_graph()` monkey-patches `add_node()` to auto-wrap node functions
2. The `@traced` decorator publishes events to Redis Streams (1-2ms)
3. The SwisperStudio consumer reads from the stream and stores events in the database
4. Zero user-facing latency (fire-and-forget pattern)
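The monkey-patching idea behind the auto-instrumentation can be sketched with a toy graph class. `MiniGraph`, `trace_graph`, and the timing-only wrapper are illustrative stand-ins (the SDK's real wrapper publishes to Redis instead of appending to a list):

```python
import functools
import time

class MiniGraph:
    """Tiny stand-in for StateGraph, just enough to show the idea."""
    def __init__(self):
        self.nodes = {}

    def add_node(self, name, fn):
        self.nodes[name] = fn

def trace_graph(graph, events):
    """Patch add_node so every registered node is wrapped with timing."""
    original_add_node = graph.add_node

    def patched_add_node(name, fn):
        @functools.wraps(fn)
        def wrapper(state):
            start = time.perf_counter()
            result = fn(state)  # run the real node
            events.append({"node": name, "ms": (time.perf_counter() - start) * 1000})
            return result
        return original_add_node(name, wrapper)

    graph.add_node = patched_add_node
    return graph

events = []
graph = trace_graph(MiniGraph(), events)
graph.add_node("double", lambda state: state * 2)  # registered via the patch
result = graph.nodes["double"](21)
```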
Configuration
Required Settings
```python
# In your config.py or .env
SWISPER_STUDIO_REDIS_URL: str = "redis://redis:6379"
SWISPER_STUDIO_PROJECT_ID: str = "your-project-id"
SWISPER_STUDIO_STREAM_NAME: str = "observability:events"
```
Optional Settings
```python
# Reasoning capture
SWISPER_STUDIO_CAPTURE_REASONING: bool = True
SWISPER_STUDIO_REASONING_MAX_LENGTH: int = 50000  # 50 KB

# Connection verification
SWISPER_STUDIO_VERIFY_CONSUMER: bool = True  # Check consumer health
```
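A stdlib-only sketch of reading these settings from environment variables, falling back to the documented defaults. The `load_settings` helper is an assumption for illustration; the SDK may use a different mechanism (e.g. pydantic settings).

```python
import os

def load_settings(env=None):
    """Read SwisperStudio settings from the environment with the
    documented defaults. Illustrative helper, not the SDK's API."""
    env = os.environ if env is None else env
    return {
        "redis_url": env.get("SWISPER_STUDIO_REDIS_URL", "redis://redis:6379"),
        "project_id": env.get("SWISPER_STUDIO_PROJECT_ID", ""),
        "stream_name": env.get("SWISPER_STUDIO_STREAM_NAME", "observability:events"),
        "capture_reasoning": env.get("SWISPER_STUDIO_CAPTURE_REASONING", "true").lower() == "true",
        "reasoning_max_length": int(env.get("SWISPER_STUDIO_REASONING_MAX_LENGTH", "50000")),
        "verify_consumer": env.get("SWISPER_STUDIO_VERIFY_CONSUMER", "true").lower() == "true",
    }

defaults = load_settings({})  # empty env -> all defaults
```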
Requirements
- Python 3.11+
- LangGraph >= 1.0.0, < 2.0.0
- langgraph-checkpoint >= 2.1.0 (v0.5.2 removed upper bound for HITL compatibility)
- httpx >= 0.25.2
- redis >= 5.0.0
Migration
Upgrading from v0.3.x? See SDK_MIGRATION_v0.3.4_to_v0.4.0.md
Migration time: ~15 minutes
Breaking changes: None (backward compatible)
License
MIT
File details
Details for the file swisper_studio_sdk-0.5.3.tar.gz.
File metadata
- Download URL: swisper_studio_sdk-0.5.3.tar.gz
- Upload date:
- Size: 34.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `41cd6c6dc7c54630f9e56e6bfaa50cf87e722249e395ba1113d3084a2d359a6f` |
| MD5 | `732ac70486a9ec2aa3a229b0fc6b32b8` |
| BLAKE2b-256 | `7b5385c1a1b7cf3bba04df7bfbd3f9ec72a9358de5569596d6c64a78d8aad0fd` |
File details
Details for the file swisper_studio_sdk-0.5.3-py3-none-any.whl.
File metadata
- Download URL: swisper_studio_sdk-0.5.3-py3-none-any.whl
- Upload date:
- Size: 32.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c4b5731b6cd6fadbd1f8be55f4b3b87a76fef709dfcdd0ea7fd6aeaca56edd03` |
| MD5 | `379efaed743a8eed43c5b92f3cc81b81` |
| BLAKE2b-256 | `c756b52d6afe402ed75e78bdff24889572366d168535490880cd71357a276975` |