# Firework 🎆

A high-level, local-only client library for managing Firecracker/Docker sandboxes. No authentication, no remote APIs – just pure local execution with E2B/Daytona ergonomics.

## Features
- **Zero-config by default**: Works out-of-the-box with sensible local defaults
- **Backend agnostic**: Abstracts Firecracker, Docker, or custom runtimes
- **Async-first**: All operations are non-blocking with streaming support
- **Resource-aware**: Automatic cleanup, lifecycle management, metrics
- **Pre-built environments**: Python ML, PyTorch, TensorFlow, Node.js, and more
- **No network required**: Everything runs on localhost
## Installation

```bash
pip install firework-sandbox
```
No API keys, no daemon required.
## Quick Start

```python
import asyncio
from firework import Sandbox

async def main():
    # Create a sandbox with auto-cleanup
    async with Sandbox.create(template="python-ml") as sandbox:
        # Install packages
        await sandbox.process.exec("pip install numpy pandas")

        # Execute Python code
        result = await sandbox.process.exec("python -c 'import numpy; print(numpy.__version__)'")
        print(result.stdout)  # "1.24.0\n"

        # Upload a file
        await sandbox.filesystem.write("/app/script.py", """
import pandas as pd
df = pd.DataFrame({'a': [1, 2, 3], 'b': [4, 5, 6]})
df.to_csv('/app/output.csv', index=False)
print(df.describe())
""")

        # Run the script
        result = await sandbox.process.exec("python /app/script.py")
        print(result.stdout)

        # Download the result
        await sandbox.filesystem.download("/app/output.csv", "./output.csv")

asyncio.run(main())
```
## Pre-built Environments
Firework comes with pre-configured environments for common use cases:
| Environment | Description | Packages |
|---|---|---|
| `base` | Minimal Python 3.11 | pip, setuptools |
| `python-ml` | Machine Learning | numpy, pandas, scikit-learn, matplotlib |
| `python-torch` | Deep Learning (PyTorch) | torch, torchvision, numpy |
| `python-tensorflow` | Deep Learning (TensorFlow) | tensorflow, keras, numpy |
| `python-data` | Data Engineering | polars, duckdb, pyarrow, sqlalchemy |
| `python-web` | Web Development | fastapi, uvicorn, httpx, pydantic |
| `python-llm` | LLM/AI Applications | openai, anthropic, langchain, transformers |
| `nodejs` | Node.js 20 | Node.js runtime |
| `nodejs-full` | Node.js with tools | typescript, ts-node |
```python
# Use a specific environment
sandbox = await Sandbox.create(template="python-torch")

# List all available environments
from firework import list_environments

for env in list_environments():
    print(f"{env['name']}: {env['description']}")
```
## Sandbox Lifecycle

### Create

```python
from firework import Sandbox

# Minimal (recommended)
sandbox = await Sandbox.create(template="base")

# Full control
sandbox = await Sandbox.create(
    template="python-ml",
    name="data-analysis-123",
    environment={"DEBUG": "true", "PYTHONUNBUFFERED": "1"},
    vcpu=2,
    memory_mb=1024,
    timeout_seconds=3600,
)

print(f"ID: {sandbox.id}")           # sbx_abc123...
print(f"Root: {sandbox.root_path}")  # ~/.firework/sandboxes/sbx_abc123
```

### Reconnect to an Existing Sandbox

```python
# Reconnect to a running sandbox
sandbox = await Sandbox.reconnect("sbx_abc123")
```
### State Management

```python
# Check state
state = await sandbox.get_state()  # "running", "paused", "stopped"

# Pause/Resume (saves resources)
await sandbox.pause()
await sandbox.resume()

# Destroy (cleanup)
await sandbox.destroy()
```

### Auto-cleanup with Context Manager

```python
async with Sandbox.create(template="base") as sandbox:
    result = await sandbox.process.exec("echo Hello")
# Automatically destroyed on exit
```
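The cleanup guarantee comes from the async context-manager protocol: `__aexit__` runs even when the body raises. A self-contained sketch with a stub class (`StubSandbox` is illustrative, not part of firework):

```python
import asyncio

class StubSandbox:
    # Minimal stand-in for the Sandbox async context manager above
    def __init__(self):
        self.destroyed = False

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        self.destroyed = True  # cleanup runs even if the body raised
        return False           # don't swallow the exception

async def demo() -> bool:
    box = StubSandbox()
    try:
        async with box:
            raise RuntimeError("work failed")
    except RuntimeError:
        pass
    return box.destroyed

print(asyncio.run(demo()))  # True
```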
## Process Execution

### Blocking Execution

```python
result = await sandbox.process.exec(
    command="python /app/analyze.py",
    cwd="/workspace",
    environment={"INPUT": "/data/file.csv"},
    timeout_seconds=30,
)

print(result.stdout)
print(result.stderr)
print(f"Exit code: {result.exit_code}")
print(f"Runtime: {result.runtime_seconds}s")
```
### Streaming Execution

```python
stream = await sandbox.process.exec_stream(
    command="python long_running.py",
    cwd="/app",
)

async for event in stream:
    match event.type:
        case "stdout":
            print(f"OUT: {event.content}", end="")
        case "stderr":
            print(f"ERR: {event.content}", end="")
        case "exit":
            print(f"DONE: exit code {event.exit_code}")
```
### Background Processes

```python
# Start a server in the background
server = await sandbox.process.start(
    command="python -m http.server 8000",
    cwd="/public",
    background=True,
)

# Check if running
running = await server.is_running()

# Wait for completion or kill
try:
    exit_code = await server.wait(timeout_seconds=60)
except asyncio.TimeoutError:
    await server.kill()
```
### Batch Execution

```python
# Run multiple commands sequentially
results = await sandbox.process.batch_exec([
    "pip install -r requirements.txt",
    "python setup.py",
    "python train.py",
    "python evaluate.py",
], stop_on_error=True)

for i, result in enumerate(results):
    print(f"Command {i+1}: exit code {result.exit_code}")
```
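The `stop_on_error=True` semantics can be sketched in plain asyncio: run commands one at a time and stop at the first non-zero exit code (`Result` and `run` here are stand-ins, not firework types):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Result:
    command: str
    exit_code: int

async def run(command: str) -> Result:
    # Stand-in for sandbox.process.exec(); pretend "fail" commands exit non-zero
    return Result(command, 1 if "fail" in command else 0)

async def batch_exec(commands: list[str], stop_on_error: bool = True) -> list[Result]:
    results: list[Result] = []
    for command in commands:
        result = await run(command)  # sequential: each awaits the previous
        results.append(result)
        if stop_on_error and result.exit_code != 0:
            break
    return results

results = asyncio.run(batch_exec(["ok-1", "fail-2", "ok-3"]))
print([r.exit_code for r in results])  # [0, 1]; the third command never runs
```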
## Filesystem Operations

### Read/Write Files

```python
# Read text
content = await sandbox.filesystem.read("/config.json")

# Read binary
data = await sandbox.filesystem.read_bytes("/model.bin")

# Write text
await sandbox.filesystem.write("/output.txt", "Hello World")

# Write binary
await sandbox.filesystem.write_bytes("/model.weights", model_bytes)
```

### Upload/Download Files

```python
# Upload single file
await sandbox.filesystem.upload("./local.csv", "/data/input.csv")

# Download single file
await sandbox.filesystem.download("/data/output.json", "./result.json")

# Upload entire directory
await sandbox.filesystem.upload_dir("./project", "/workspace")

# Download entire directory
await sandbox.filesystem.download_dir("/results", "./local_results")
```

### Directory Operations

```python
# List directory contents
files = await sandbox.filesystem.list("/data")
for f in files:
    print(f"{f.name}: {f.size} bytes ({f.type})")

# Create directory
await sandbox.filesystem.mkdir("/workspace/output", recursive=True)

# Remove file
await sandbox.filesystem.remove("/tmp/cache.txt")

# Remove directory
await sandbox.filesystem.remove_dir("/tmp/old_data")

# Check existence
exists = await sandbox.filesystem.exists("/app/script.py")
```
## Observability

### Metrics

```python
metrics = await sandbox.get_metrics()

print(f"CPU: {metrics.cpu_percent}%")
print(f"Memory: {metrics.memory_mb} MB")
print(f"Disk: {metrics.disk_mb_used} MB")
print(f"Uptime: {metrics.uptime_seconds}s")
print(f"Processes: {metrics.process_count}")
```

### Events

```python
def on_created(sandbox):
    print(f"Sandbox created: {sandbox.id}")

def on_destroyed(sandbox):
    print(f"Sandbox destroyed: {sandbox.id}")

sandbox.on("created", on_created)
sandbox.on("destroyed", on_destroyed)
```
## Error Handling

```python
from firework import (
    SandboxError,
    SandboxNotFound,
    SandboxTimeout,
    ProcessExecutionError,
    FilesystemError,
)

try:
    result = await sandbox.process.exec("nonexistent_command")
except ProcessExecutionError as e:
    print(f"Command failed: exit code {e.exit_code}")
    print(f"Stderr: {e.stderr}")
except SandboxTimeout as e:
    print(f"Operation timed out after {e.timeout_seconds}s")
except SandboxNotFound as e:
    print(f"Sandbox not found: {e.sandbox_id}")
except FilesystemError as e:
    print(f"Filesystem error: {e.operation} on {e.path}")
except SandboxError as e:
    print(f"General sandbox error: {e}")
```
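The ordering above matters because the specific exceptions subclass `SandboxError`, so the base-class handler must come last. A minimal sketch of such a hierarchy (attribute names taken from the handlers above; the real class definitions may differ):

```python
class SandboxError(Exception):
    """Base class: an `except SandboxError` clause catches every subclass."""

class ProcessExecutionError(SandboxError):
    # Attribute names follow the handlers above; definitions are illustrative
    def __init__(self, exit_code: int, stderr: str):
        super().__init__(f"exit code {exit_code}")
        self.exit_code = exit_code
        self.stderr = stderr

try:
    raise ProcessExecutionError(127, "command not found")
except SandboxError as e:
    # The base-class handler still sees the subclass attributes
    print(e.exit_code, e.stderr)  # 127 command not found
```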
## CLI

```bash
# Create a sandbox
firework create --template python-ml --name my-sandbox

# List running sandboxes
firework list

# Execute command in sandbox
firework exec sbx_abc123 "python --version"

# Destroy sandbox
firework destroy sbx_abc123

# List available environments
firework env list

# Show environment details
firework env info python-ml

# Build an environment
firework env build python-torch --size 4096
```
## Configuration

### Programmatic Configuration

```python
from firework import LocalConfig, set_config

config = LocalConfig(
    runtime_dir="/custom/path/sandboxes",
    env_dir="/custom/path/environments",
    default_template="python-ml",
    default_timeout=120,
    default_vcpu=2,
    default_memory_mb=1024,
    log_level="DEBUG",
)
set_config(config)
```
### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `FIREWORK_RUNTIME_DIR` | Sandbox runtime directory | `~/.firework/sandboxes` |
| `FIREWORK_ENV_DIR` | Built environments directory | `~/.firework/environments` |
| `FIREWORK_DEFAULT_TEMPLATE` | Default template | `base` |
| `FIREWORK_DEFAULT_TIMEOUT` | Default timeout (seconds) | `60` |
| `FIREWORK_DEFAULT_VCPU` | Default vCPU count | `1` |
| `FIREWORK_DEFAULT_MEMORY_MB` | Default memory (MB) | `512` |
| `FIREWORK_LOG_LEVEL` | Log level | `INFO` |

```bash
export FIREWORK_RUNTIME_DIR=/data/sandboxes
export FIREWORK_DEFAULT_MEMORY_MB=2048
```
## Complete Examples

### Data Analysis Pipeline

```python
import json

async def analyze_csv(csv_path: str) -> dict:
    async with Sandbox.create(template="python-ml") as sandbox:
        # Upload data
        await sandbox.filesystem.upload(csv_path, "/data/input.csv")

        # Write analysis script
        await sandbox.filesystem.write("/app/analyze.py", """
import pandas as pd
import json

df = pd.read_csv('/data/input.csv')
results = {
    'rows': len(df),
    'columns': list(df.columns),
    'summary': df.describe().to_dict()
}
with open('/data/results.json', 'w') as f:
    json.dump(results, f, indent=2)
print('Analysis complete!')
""")

        # Run analysis
        result = await sandbox.process.exec("python /app/analyze.py")
        print(result.stdout)

        # Download results
        await sandbox.filesystem.download("/data/results.json", "./results.json")

        # Read and return results
        content = await sandbox.filesystem.read("/data/results.json")
        return json.loads(content)
```
### Machine Learning Training

```python
async def train_model(data_path: str, epochs: int = 10):
    async with Sandbox.create(
        template="python-torch",
        vcpu=2,
        memory_mb=2048,
        timeout_seconds=3600,
    ) as sandbox:
        # Upload training data
        await sandbox.filesystem.upload_dir(data_path, "/data")

        # Upload training script
        await sandbox.filesystem.upload("./train.py", "/app/train.py")

        # Train with streaming output
        stream = await sandbox.process.exec_stream(
            f"python /app/train.py --epochs {epochs}",
            cwd="/app",
        )
        async for event in stream:
            if event.type == "stdout":
                print(event.content, end="")

        # Download trained model
        await sandbox.filesystem.download("/app/model.pt", "./model.pt")
```
### Web Scraping

```python
import json

async def scrape_urls(urls: list[str]) -> list[dict]:
    async with Sandbox.create(template="python-web") as sandbox:
        # Install additional packages
        await sandbox.process.exec("pip install beautifulsoup4 lxml")

        # Write scraper script
        await sandbox.filesystem.write("/app/scraper.py", f"""
import httpx
from bs4 import BeautifulSoup
import json

urls = {urls}
results = []
for url in urls:
    try:
        resp = httpx.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, 'lxml')
        results.append({{
            'url': url,
            'title': soup.title.string if soup.title else None,
            'status': resp.status_code
        }})
    except Exception as e:
        results.append({{'url': url, 'error': str(e)}})

with open('/app/results.json', 'w') as f:
    json.dump(results, f, indent=2)
""")

        # Run scraper
        await sandbox.process.exec("python /app/scraper.py", timeout_seconds=120)

        # Get results
        content = await sandbox.filesystem.read("/app/results.json")
        return json.loads(content)
```
### Parallel Processing with Worker Pool

```python
async def parallel_process(items: list[str]) -> list[str]:
    # Create a pool of sandboxes
    pool = await asyncio.gather(*[
        Sandbox.create(template="base") for _ in range(4)
    ])
    try:
        # Distribute work across the pool
        async def process_item(sandbox, item):
            result = await sandbox.process.exec(f"echo 'Processing: {item}'")
            return result.stdout.strip()

        tasks = [
            process_item(pool[i % len(pool)], item)
            for i, item in enumerate(items)
        ]
        return await asyncio.gather(*tasks)
    finally:
        # Clean up all sandboxes
        await asyncio.gather(*(s.destroy() for s in pool))
```
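One caveat of the modulo distribution above: when items outnumber sandboxes, several tasks may hit the same sandbox concurrently. A worker-queue variant gives each worker exclusive use of one pool slot; here a stub stands in for `sandbox.process.exec` so the sketch runs without firework:

```python
import asyncio

async def bounded_parallel_process(items: list[str], workers: int = 4) -> list[str]:
    queue: asyncio.Queue[tuple[int, str]] = asyncio.Queue()
    for pair in enumerate(items):
        queue.put_nowait(pair)
    results: list[str] = [""] * len(items)

    async def worker() -> None:
        # Each worker would own one sandbox; a stub replaces process.exec here
        while not queue.empty():
            index, item = queue.get_nowait()
            await asyncio.sleep(0)  # yield control, as a real exec would
            results[index] = f"Processing: {item}"

    await asyncio.gather(*(worker() for _ in range(workers)))
    return results

out = asyncio.run(bounded_parallel_process(["a", "b", "c", "d", "e"]))
print(out)  # every item handled exactly once, results in input order
```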
## Requirements
- Python 3.10+
- Docker (for Docker backend)
- Firecracker (optional, for microVM backend)
## License
MIT License - see LICENSE for details.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.