# Sandflare Python SDK

The official Python SDK for Sandflare — Firecracker microVM sandboxes for AI agents.
## Install

```bash
pip install sandflare
```

For local development (editable install):

```bash
pip install -e sdk/python/
```
## Quick start

```python
from sandflare import Sandbox

sb = Sandbox.create("agent", size="nano")
result = sb.exec("echo hello")
print(result.stdout)  # "hello\n"
sb.delete()
```
## Environment variables

| Variable | Description |
|---|---|
| `SANDFLARE_API_KEY` | API key (required) |
| `SANDFLARE_API_URL` | Optional base URL override (default: `https://api.sandflare.io`) |
| `PANDAAGENT_API_KEY` | Legacy alias for `SANDFLARE_API_KEY` |
| `PANDAAGENT_BASE_URL` | Legacy alias for `SANDFLARE_API_URL` |

All methods use the `X-API-Key` header for authentication.
## Sandbox

### Create

```python
# Basic
sb = Sandbox.create("my-agent")

# With size/tier
sb = Sandbox.create("my-agent", size="nano")    # 512 MB RAM
sb = Sandbox.create("my-agent", size="small")   # 1 GB RAM
sb = Sandbox.create("my-agent", size="medium")  # 2 GB RAM

# With options
sb = Sandbox.create(
    "my-agent",
    size="small",
    ttl_hours=2,
    ephemeral=True,  # auto-deleted on close
    metadata={"env": "prod", "version": "1.0"},
    label="production-run",
)

# From a snapshot
sb = Sandbox.create("restored", snapshot_id="<snapshot-uuid>")
```
`sb.info` is a `SandboxInfo` dataclass with these fields:

| Field | Type | Description |
|---|---|---|
| `name` | `str` | Unique sandbox ID (e.g. `sb-abc123`) |
| `status` | `str` | `ready`, `paused`, or `stopped` |
| `tier` | `str \| None` | `nano`, `small`, `medium`, `large`, `xl` |
| `label` | `str` | Human-readable label |
| `preview_url` | `str \| None` | e.g. `https://3000-sb-abc123.sandflare.io` |
| `subdomain_url` | `str \| None` | e.g. `https://sb-abc123.sandflare.io` (port-less) |
| `agent_url` | `str \| None` | Internal agent URL |
| `metadata` | `dict` | Custom key-value metadata |
| `ephemeral` | `bool` | `True` if auto-deleted on close |
| `expires_at` | `str \| None` | ISO timestamp if a TTL is set |
| `memory_mb` | `int` | RAM in MB |
| `vcpus` | `int` | vCPU count |
### Get & List

```python
# Get by name
sb = Sandbox.get("sb-abc123")

# List all
sandboxes = Sandbox.list()

# Filter by label
sandboxes = Sandbox.list(label="production-run")

# Filter by metadata
sandboxes = Sandbox.list(metadata={"env": "prod"})
```
### Exec

```python
# Run a shell command
result = sb.exec("echo hello")
print(result.stdout)     # "hello\n"
print(result.exit_code)  # 0
print(result.ok)         # True

# With env vars
result = sb.exec("echo $MY_VAR", env={"MY_VAR": "hello_world"})

# With working directory and timeout
result = sb.exec("ls", cwd="/tmp", timeout=10)

# Stream output in real time
for event in sb.exec_stream("python3 train.py", timeout=120):
    if event.type in ("stdout", "stderr"):
        print(event.data, flush=True)
    elif event.type == "done":
        print(f"\nexited {event.exit_code}")
```
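In agent loops it is often useful to fail fast on a non-zero exit instead of checking `result.ok` at every call site. A sketch of that pattern, using a stand-in `ExecResult` that mirrors the fields shown above (`check` is a hypothetical helper, not an SDK function):

```python
from dataclasses import dataclass

@dataclass
class ExecResult:
    # Stand-in mirroring the fields used above (stdout, exit_code, ok).
    stdout: str
    stderr: str
    exit_code: int

    @property
    def ok(self) -> bool:
        return self.exit_code == 0

def check(result: ExecResult) -> str:
    # Raise on failure so the caller never silently consumes bad output.
    if not result.ok:
        raise RuntimeError(f"command failed ({result.exit_code}): {result.stderr.strip()}")
    return result.stdout
```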
### File I/O

```python
# Write and read text
sb.write_file("/tmp/hello.txt", "hello world")
content = sb.read_file("/tmp/hello.txt")  # str

# Binary files
sb.write_file("/tmp/data.bin", b"\x00\x01\x02")
raw = sb.read_file("/tmp/data.bin")

# List a directory
entries = sb.ls("/tmp")
for e in entries:
    print(e.name, e.size, e.is_dir)
```
### Run code

```python
# Python
result = sb.run_python("print(2 + 2)")
print(result.stdout)  # "4\n"

# Node.js
result = sb.run_node("console.log(2 + 2)")
print(result.stdout)  # "4\n"
```
### Lifecycle

```python
sb.pause()   # snapshot memory to disk, free compute
sb.resume()  # restore from the snapshot
sb.delete()  # destroy permanently
sb.close()   # alias for delete()
```
### Snapshots

```python
# Create a snapshot (the sandbox keeps running afterward)
snap = sb.create_snapshot(
    name="after-data-load",
    description="Dataset v2 loaded",
    tags=["prod", "v2"],
)
print(snap.snapshot_id)  # UUID

# List snapshots for this sandbox
snaps = sb.list_snapshots()
for s in snaps:
    print(s.snapshot_id, s.name, s.status)

# Restore (creates a new sandbox from the snapshot)
restored = Sandbox.create("fork", snapshot_id=snap.snapshot_id)
```
### Pause → auto-resume

Calling `Sandbox.get(name)` on a paused sandbox resumes it transparently:

```python
sb.pause()
# ... later ...
sb2 = Sandbox.get(sb.name)  # auto-resumes; file state is preserved
```
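Conceptually, the resume happens server-side before the handle is returned, so callers never observe a paused sandbox from `get()`. A toy sketch of that behavior against an in-memory stand-in for the control plane (not the SDK's actual implementation):

```python
# In-memory stand-in for the control plane, for illustration only.
_store = {"sb-abc123": {"status": "paused"}}

def get(name: str) -> dict:
    # Mirrors the documented behavior: fetching a paused sandbox
    # resumes it before the handle is returned.
    info = _store[name]
    if info["status"] == "paused":
        info["status"] = "ready"  # restore the memory snapshot
    return info
```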
## AI Memory (mem0-backed)

Sandbox-scoped memory stores learnings, preferences, and context that persist across runs and are semantically searchable.

```python
# Sandbox-scoped memory
sb.memory.add("User prefers async patterns and strict typing")
sb.memory.add("Bug fix: always call .commit() after INSERT", category="bug_fix")
sb.memory.add("Project uses FastAPI + PostgreSQL", infer=True)  # LLM extraction (~30 s, async)

results = sb.memory.search("what database does this project use?")
for r in results:
    print(r["memory"])

memories = sb.memory.list()

# User-level memory (persists across ALL sandboxes)
from sandflare import UserMemory

mem = UserMemory()
mem.add("Prefers TypeScript over JavaScript")
results = mem.search("language preferences")
memories = mem.list()
```
## Custom templates

```python
from sandflare import Template

# Build from a Dockerfile
job = Template.build(
    name="my-env",
    dockerfile="FROM ubuntu:22.04\nRUN apt-get update && apt-get install -y python3",
)

# Poll until ready
job = Template.wait_for_build(job.id, timeout_seconds=600)

# Use the template
sb = Sandbox.create("agent", template_id=job.id)

# Manage templates
jobs = Template.list()
job = Template.get_build_status("tmpl-abc123")
Template.delete("tmpl-abc123")
```
## Auto-snapshots

```python
# Enable: snapshot every 15 minutes
sb.enable_auto_snapshot(interval_mins=15)

# Disable
sb.disable_auto_snapshot()
```
## Metrics & processes

```python
m = sb.metrics()
print(f"CPU: {m.cpu_used_pct:.1f}%  RAM: {m.mem_used}/{m.mem_total}")

procs = sb.get_processes()
for p in procs:
    print(p.pid, p.command)

sb.kill_process(1234)
```
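These metrics pair naturally with a resource guard: poll `metrics()`, and pause the sandbox or kill a runaway process when a budget is exceeded. A small sketch of the threshold check itself (`over_budget` is a hypothetical helper built on the fields shown above):

```python
def over_budget(cpu_used_pct: float, mem_used: int, mem_total: int,
                cpu_limit: float = 90.0, mem_frac: float = 0.9) -> bool:
    # True when the sandbox exceeds either the CPU or memory budget,
    # a natural trigger for sb.kill_process() or sb.pause().
    return cpu_used_pct > cpu_limit or mem_used > mem_frac * mem_total
```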
## Git clone

```python
result = sb.git_clone(
    "https://github.com/org/repo",
    path="/home/agent/repo",
    branch="main",
    depth=1,
)
print(result.output)
```
## Build & publish

```bash
# Build the wheel
cd sdk/python
python3 -m pip wheel . --no-deps -w dist/

# Publish to PyPI
TWINE_USERNAME=__token__ TWINE_PASSWORD=<token> python3 -m twine upload dist/*
```