Sandflare Python SDK

The official Python SDK for Sandflare — Firecracker microVM sandboxes for AI agents.

Install

pip install sandflare

For local development (editable):

pip install -e sdk/python/

Quick start

import os
from sandflare import Sandbox

sb = Sandbox.create("agent", size="nano")
result = sb.exec("echo hello")
print(result.stdout)   # "hello\n"
sb.delete()

Environment variables

| Variable | Description |
| --- | --- |
| `SANDFLARE_API_KEY` | API key (required) |
| `SANDFLARE_API_URL` | Optional base-URL override (default: `https://api.sandflare.io`) |
| `PANDAAGENT_API_KEY` | Legacy alias for `SANDFLARE_API_KEY` |
| `PANDAAGENT_BASE_URL` | Legacy alias for `SANDFLARE_API_URL` |

All requests authenticate via the X-API-Key header.
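The alias table above implies a resolution order: the `SANDFLARE_*` variables win, the legacy `PANDAAGENT_*` names are consulted as fallbacks, and the base URL defaults when neither is set. A minimal sketch of that logic (the helper name `resolve_credentials` is illustrative, not part of the SDK):

```python
import os

def resolve_credentials() -> tuple[str, str]:
    """Resolve (api_key, base_url), honoring the legacy PANDAAGENT_* aliases."""
    api_key = os.environ.get("SANDFLARE_API_KEY") or os.environ.get("PANDAAGENT_API_KEY")
    if not api_key:
        raise RuntimeError("SANDFLARE_API_KEY is required")
    base_url = (
        os.environ.get("SANDFLARE_API_URL")
        or os.environ.get("PANDAAGENT_BASE_URL")
        or "https://api.sandflare.io"  # documented default
    )
    return api_key, base_url
```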


Sandbox

Create

# Basic
sb = Sandbox.create("my-agent")

# With size/tier
sb = Sandbox.create("my-agent", size="nano")   # 512 MB RAM
sb = Sandbox.create("my-agent", size="small")  # 1 GB RAM
sb = Sandbox.create("my-agent", size="medium") # 2 GB RAM

# With options
sb = Sandbox.create("my-agent",
    size="small",
    ttl_hours=2,
    ephemeral=True,          # auto-deleted on close
    metadata={"env": "prod", "version": "1.0"},
    label="production-run",
)

# From a snapshot
sb = Sandbox.create("restored", snapshot_id="<snapshot-uuid>")

sb.info is a SandboxInfo dataclass with these fields:

| Field | Type | Description |
| --- | --- | --- |
| `name` | `str` | Unique sandbox ID (e.g. `sb-abc123`) |
| `status` | `str` | `ready`, `paused`, or `stopped` |
| `tier` | `str \| None` | `nano`, `small`, `medium`, `large`, or `xl` |
| `label` | `str` | Human-readable label |
| `preview_url` | `str \| None` | e.g. `https://3000-sb-abc123.sandflare.io` |
| `subdomain_url` | `str \| None` | e.g. `https://sb-abc123.sandflare.io` (port-less) |
| `agent_url` | `str \| None` | Internal agent URL |
| `metadata` | `dict` | Custom key-value metadata |
| `ephemeral` | `bool` | `True` if auto-deleted on close |
| `expires_at` | `str \| None` | ISO timestamp if a TTL is set |
| `memory_mb` | `int` | RAM in MB |
| `vcpus` | `int` | vCPU count |
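The fields above correspond roughly to a dataclass like the following. This is an illustrative sketch of the documented shape, not the SDK's actual source; field defaults are assumptions.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class SandboxInfo:
    name: str                         # unique sandbox ID, e.g. "sb-abc123"
    status: str                       # "ready", "paused", or "stopped"
    label: str = ""                   # human-readable label
    tier: str | None = None          # "nano" | "small" | "medium" | "large" | "xl"
    preview_url: str | None = None
    subdomain_url: str | None = None  # port-less variant of preview_url
    agent_url: str | None = None     # internal agent URL
    metadata: dict = field(default_factory=dict)
    ephemeral: bool = False           # True if auto-deleted on close
    expires_at: str | None = None    # ISO timestamp if a TTL is set
    memory_mb: int = 0
    vcpus: int = 0
```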

Get & List

# Get by name
sb = Sandbox.get("sb-abc123")

# List all
sandboxes = Sandbox.list()

# Filter by label
sandboxes = Sandbox.list(label="production-run")

# Filter by metadata
sandboxes = Sandbox.list(metadata={"env": "prod"})

Exec

# Run a shell command
result = sb.exec("echo hello")
print(result.stdout)   # "hello\n"
print(result.exit_code)  # 0
print(result.ok)         # True

# With env vars
result = sb.exec("echo $MY_VAR", env={"MY_VAR": "hello_world"})

# With working directory and timeout
result = sb.exec("ls", cwd="/tmp", timeout=10)

# Stream output in real-time
for event in sb.exec_stream("python3 train.py", timeout=120):
    if event.type in ("stdout", "stderr"):
        print(event.data, flush=True)
    elif event.type == "done":
        print(f"\nexited {event.exit_code}")
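The streaming loop above can be wrapped into a small collector. Events only need `type`, `data`, and `exit_code` attributes, so this sketch uses a stand-in event type; the names `ExecEvent` and `collect_stream` are illustrative, not SDK API.

```python
from __future__ import annotations
from dataclasses import dataclass
from typing import Iterable

@dataclass
class ExecEvent:
    type: str                  # "stdout", "stderr", or "done"
    data: str = ""
    exit_code: int | None = None

def collect_stream(events: Iterable[ExecEvent]) -> tuple[str, str, int]:
    """Drain a stream of exec events into (stdout, stderr, exit_code)."""
    out, err, code = [], [], 0
    for event in events:
        if event.type == "stdout":
            out.append(event.data)
        elif event.type == "stderr":
            err.append(event.data)
        elif event.type == "done":
            code = event.exit_code or 0
    return "".join(out), "".join(err), code
```

Against the real SDK this would be called as `collect_stream(sb.exec_stream("python3 train.py"))`.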

File I/O

# Write and read
sb.write_file("/tmp/hello.txt", "hello world")
content = sb.read_file("/tmp/hello.txt")   # str

# Binary files
sb.write_file("/tmp/data.bin", b"\x00\x01\x02")
raw = sb.read_file("/tmp/data.bin")

# List directory
entries = sb.ls("/tmp")
for e in entries:
    print(e.name, e.size, e.is_dir)
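Since entries expose `name`, `size`, and `is_dir`, the `ls` call composes into a recursive walk. A sketch built only on the documented `sb.ls()` call (the `walk` helper itself is illustrative, not SDK API):

```python
def walk(sb, path: str):
    """Yield (full_path, size) for every regular file under `path`,
    recursing into subdirectories via repeated sb.ls() calls."""
    for entry in sb.ls(path):
        full = f"{path.rstrip('/')}/{entry.name}"
        if entry.is_dir:
            yield from walk(sb, full)
        else:
            yield full, entry.size
```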

Run code

# Python
result = sb.run_python("print(2 + 2)")
print(result.stdout)   # "4\n"

# Node.js
result = sb.run_node("console.log(2 + 2)")
print(result.stdout)   # "4\n"

Lifecycle

sb.pause()         # snapshot memory to disk, free compute
sb.resume()        # restore from snapshot
sb.delete()        # destroy permanently
sb.close()         # alias for delete()

Snapshots

# Create snapshot (continues running after)
snap = sb.create_snapshot(
    name="after-data-load",
    description="Dataset v2 loaded",
    tags=["prod", "v2"],
)
print(snap.snapshot_id)   # UUID

# List snapshots for this sandbox
snaps = sb.list_snapshots()
for s in snaps:
    print(s.snapshot_id, s.name, s.status)

# Restore (creates new sandbox from snapshot)
restored = Sandbox.create("fork", snapshot_id=snap.snapshot_id)

Pause → auto-resume

When you call Sandbox.get(name) on a paused sandbox, it auto-resumes transparently:

sb.pause()
# ... later ...
sb2 = Sandbox.get(sb.name)  # auto-resumes, file state preserved
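Resume is not necessarily instantaneous. If your code must block until the sandbox is usable, a small polling helper works; this is a sketch that assumes only the documented `status` field, and `wait_until_ready` is not part of the SDK:

```python
import time

def wait_until_ready(get_info, timeout: float = 60.0, poll: float = 1.0):
    """Poll `get_info()` until it reports status 'ready' or the timeout expires.
    `get_info` stands in for a call like `lambda: Sandbox.get(name).info`."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_info()
        if info.status == "ready":
            return info
        time.sleep(poll)
    raise TimeoutError("sandbox did not become ready in time")
```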

AI Memory (mem0-backed)

Sandbox-scoped memory: store learnings, preferences, and context that persist across runs and are semantically searchable.

# Sandbox-scoped memory
sb.memory.add("User prefers async patterns and strict typing")
sb.memory.add("Bug fix: always call .commit() after INSERT", category="bug_fix")
sb.memory.add("Project uses FastAPI + PostgreSQL", infer=True)  # LLM extraction (~30s async)

results = sb.memory.search("what database does this project use?")
for r in results:
    print(r["memory"])

memories = sb.memory.list()

# User-level memory (persists across ALL sandboxes)
from sandflare import UserMemory

mem = UserMemory()
mem.add("Prefers TypeScript over JavaScript")
results = mem.search("language preferences")
memories = mem.list()

Custom templates

from sandflare import Template

# Build from Dockerfile
job = Template.build(
    name="my-env",
    dockerfile="FROM ubuntu:22.04\nRUN apt-get update && apt-get install -y python3",
)

# Poll until ready
job = Template.wait_for_build(job.id, timeout_seconds=600)

# Use the template
sb = Sandbox.create("agent", template_id=job.id)

# Manage templates
jobs = Template.list()
job = Template.get_build_status("tmpl-abc123")
Template.delete("tmpl-abc123")

Auto-snapshots

# Enable: snapshot every 15 minutes
sb.enable_auto_snapshot(interval_mins=15)

# Disable
sb.disable_auto_snapshot()

Metrics & processes

m = sb.metrics()
print(f"CPU: {m.cpu_used_pct:.1f}%  RAM: {m.mem_used}/{m.mem_total}")

procs = sb.get_processes()
for p in procs:
    print(p.pid, p.command)

sb.kill_process(1234)

Git clone

result = sb.git_clone(
    "https://github.com/org/repo",
    path="/home/agent/repo",
    branch="main",
    depth=1,
)
print(result.output)

Build & publish

# Build wheel
cd sdk/python
python3 -m pip wheel . --no-deps -w dist/

# Publish to PyPI
TWINE_USERNAME=__token__ TWINE_PASSWORD=<token> python3 -m twine upload dist/*

Download files

Source distribution: sandflare-2.1.16.tar.gz (27.3 kB)
Built distribution: sandflare-2.1.16-py3-none-any.whl (25.3 kB, Python 3)

File details

sandflare-2.1.16.tar.gz

  • Size: 27.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | e8a8c6f393127acb512c2d3982a0c8e1f585deb118f07f981970a6f9bf2566ae |
| MD5 | aa022c22e1c91f4138566da6e228b302 |
| BLAKE2b-256 | eb7fb5762d34c710902c65d525b14ab27ae7a7b4a9c23c490ef72527ff8a98f7 |

File details

Details for the file sandflare-2.1.16-py3-none-any.whl.

File metadata

  • Download URL: sandflare-2.1.16-py3-none-any.whl
  • Upload date:
  • Size: 25.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for sandflare-2.1.16-py3-none-any.whl
Algorithm Hash digest
SHA256 1814dada9895b98457d49c2c787529d55c2c6ebe1dd81ad6138181eefeaa573f
MD5 07c2d236ae4415b0ff0481d90f9aef5d
BLAKE2b-256 7a8ba4a4f0a3022c4fd90a2e9169b0d9e52f56e715ce6bfc5c0ef03037c7e116

See more details on using hashes here.
