
Sandflare Python SDK

The official Python SDK for Sandflare — Firecracker microVM sandboxes for AI agents.

Install

pip install sandflare

For local development (editable):

pip install -e sdk/python/

Quick start

from sandflare import Sandbox

# Requires SANDFLARE_API_KEY to be set in the environment.
sb = Sandbox.create("agent", size="nano")
result = sb.exec("echo hello")
print(result.stdout)   # "hello\n"
sb.delete()

Environment variables

Variable             Description
SANDFLARE_API_KEY    API key (required)
SANDFLARE_API_URL    Optional base URL override (default: https://api.sandflare.io)
PANDAAGENT_API_KEY   Legacy alias for SANDFLARE_API_KEY
PANDAAGENT_BASE_URL  Legacy alias for SANDFLARE_API_URL

All requests authenticate via the X-API-Key header.
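
To make the precedence between the current variables and their legacy aliases concrete, here is a minimal sketch of how a client might resolve them. The helper names (`resolve_api_key`, `resolve_base_url`) are illustrative, not part of the SDK:

```python
import os

def resolve_api_key(environ=None):
    """Prefer SANDFLARE_API_KEY; fall back to the legacy PANDAAGENT_API_KEY alias."""
    env = os.environ if environ is None else environ
    key = env.get("SANDFLARE_API_KEY") or env.get("PANDAAGENT_API_KEY")
    if not key:
        raise RuntimeError("SANDFLARE_API_KEY is not set")
    return key

def resolve_base_url(environ=None):
    """Prefer SANDFLARE_API_URL, then the legacy alias, then the default."""
    env = os.environ if environ is None else environ
    return (env.get("SANDFLARE_API_URL")
            or env.get("PANDAAGENT_BASE_URL")
            or "https://api.sandflare.io")
```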


Sandbox

Create

# Basic
sb = Sandbox.create("my-agent")

# With size/tier
sb = Sandbox.create("my-agent", size="nano")   # 512 MB RAM
sb = Sandbox.create("my-agent", size="small")  # 1 GB RAM
sb = Sandbox.create("my-agent", size="medium") # 2 GB RAM

# With options
sb = Sandbox.create("my-agent",
    size="small",
    ttl_hours=2,
    ephemeral=True,          # auto-deleted on close
    metadata={"env": "prod", "version": "1.0"},
    label="production-run",
)

# From a snapshot
sb = Sandbox.create("restored", snapshot_id="<snapshot-uuid>")

sb.info is a SandboxInfo dataclass with these fields:

Field          Type         Description
name           str          Unique sandbox ID (e.g. sb-abc123)
status         str          One of ready, paused, stopped
tier           str | None   nano, small, medium, large, xl
label          str          Human-readable label
preview_url    str | None   https://3000-sb-abc123.sandflare.io
subdomain_url  str | None   https://sb-abc123.sandflare.io (port-less)
agent_url      str | None   Internal agent URL
metadata       dict         Custom key-value metadata
ephemeral      bool         True if auto-deleted on close
expires_at     str | None   ISO timestamp if a TTL is set
memory_mb      int          RAM in MB
vcpus          int          vCPU count
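
For reference, a minimal sketch of what that dataclass could look like, with field names taken from the table above; the defaults are assumptions and the SDK's actual class may differ:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SandboxInfo:
    """Sketch of the SandboxInfo shape described in the field table."""
    name: str
    status: str
    tier: Optional[str] = None
    label: str = ""
    preview_url: Optional[str] = None
    subdomain_url: Optional[str] = None
    agent_url: Optional[str] = None
    metadata: dict = field(default_factory=dict)
    ephemeral: bool = False
    expires_at: Optional[str] = None
    memory_mb: int = 0
    vcpus: int = 0
```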

Get & List

# Get by name
sb = Sandbox.get("sb-abc123")

# List all
sandboxes = Sandbox.list()

# Filter by label
sandboxes = Sandbox.list(label="production-run")

# Filter by metadata
sandboxes = Sandbox.list(metadata={"env": "prod"})
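
The metadata filter behaves as a subset match: a sandbox qualifies when every requested key/value pair appears in its metadata. A pure-Python sketch of that predicate (the filtering itself happens server-side; this helper is hypothetical):

```python
def matches_metadata(sandbox_metadata: dict, wanted: dict) -> bool:
    """True if every wanted key/value pair appears in the sandbox's metadata."""
    return all(sandbox_metadata.get(k) == v for k, v in wanted.items())
```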

Exec

# Run a shell command
result = sb.exec("echo hello")
print(result.stdout)   # "hello\n"
print(result.exit_code)  # 0
print(result.ok)         # True

# With env vars
result = sb.exec("echo $MY_VAR", env={"MY_VAR": "hello_world"})

# With working directory and timeout
result = sb.exec("ls", cwd="/tmp", timeout=10)

# Stream output in real-time
for event in sb.exec_stream("python3 train.py", timeout=120):
    if event.type in ("stdout", "stderr"):
        print(event.data, flush=True)
    elif event.type == "done":
        print(f"\nexited {event.exit_code}")
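
If you want a streamed run to end up looking like a plain exec result, the events above (with their type, data, and exit_code attributes) can be folded into one object. A sketch, using a local result class rather than the SDK's own:

```python
from dataclasses import dataclass

@dataclass
class CollectedResult:
    stdout: str
    stderr: str
    exit_code: int

    @property
    def ok(self) -> bool:
        return self.exit_code == 0

def collect_stream(events):
    """Fold exec_stream-style events into a single result object."""
    out, err, code = [], [], None
    for ev in events:
        if ev.type == "stdout":
            out.append(ev.data)
        elif ev.type == "stderr":
            err.append(ev.data)
        elif ev.type == "done":
            code = ev.exit_code
    return CollectedResult("".join(out), "".join(err), code)
```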

File I/O

# Write and read
sb.write_file("/tmp/hello.txt", "hello world")
content = sb.read_file("/tmp/hello.txt")   # str

# Binary files
sb.write_file("/tmp/data.bin", b"\x00\x01\x02")
raw = sb.read_file("/tmp/data.bin")

# List directory
entries = sb.ls("/tmp")
for e in entries:
    print(e.name, e.size, e.is_dir)
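
The examples above suggest read_file returns str for text and bytes for binary content. One plausible way to make that distinction (an assumption about the SDK's behavior, shown as a standalone helper):

```python
def decode_if_text(raw: bytes):
    """Return str for valid UTF-8 payloads, raw bytes otherwise."""
    try:
        return raw.decode("utf-8")
    except UnicodeDecodeError:
        return raw
```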

Run code

# Python
result = sb.run_python("print(2 + 2)")
print(result.stdout)   # "4\n"

# Node.js
result = sb.run_node("console.log(2 + 2)")
print(result.stdout)   # "4\n"

Lifecycle

sb.pause()         # snapshot memory to disk, free compute
sb.resume()        # restore from snapshot
sb.delete()        # destroy permanently
sb.close()         # alias for delete()
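
For code that must not leak sandboxes, a context-manager wrapper guarantees close() runs even when the body raises. This wrapper is not part of the SDK; in real use you would pass Sandbox.create as the factory:

```python
from contextlib import contextmanager

@contextmanager
def managed_sandbox(factory, *args, **kwargs):
    """Create a sandbox via factory(*args, **kwargs) and always close it."""
    sb = factory(*args, **kwargs)
    try:
        yield sb
    finally:
        sb.close()
```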

Snapshots

# Create snapshot (continues running after)
snap = sb.create_snapshot(
    name="after-data-load",
    description="Dataset v2 loaded",
    tags=["prod", "v2"],
)
print(snap.snapshot_id)   # UUID

# List snapshots for this sandbox
snaps = sb.list_snapshots()
for s in snaps:
    print(s.snapshot_id, s.name, s.status)

# Restore (creates new sandbox from snapshot)
restored = Sandbox.create("fork", snapshot_id=snap.snapshot_id)
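
When forking from the newest usable snapshot, you first need to pick it out of list_snapshots(). A sketch of that selection, assuming each snapshot exposes a status field as shown above and a sortable created_at timestamp (the latter is a guess at the field name):

```python
def latest_ready_snapshot(snaps):
    """Return the most recent snapshot with status "ready", or None."""
    ready = [s for s in snaps if s.status == "ready"]
    return max(ready, key=lambda s: s.created_at) if ready else None
```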

Pause → auto-resume

When you call Sandbox.get(name) on a paused sandbox, it auto-resumes transparently:

sb.pause()
# ... later ...
sb2 = Sandbox.get(sb.name)  # auto-resumes, file state preserved

AI Memory (mem0-backed)

Sandbox-scoped memory: store learnings, preferences, and context that persist across runs and are semantically searchable.

# Sandbox-scoped memory
sb.memory.add("User prefers async patterns and strict typing")
sb.memory.add("Bug fix: always call .commit() after INSERT", category="bug_fix")
sb.memory.add("Project uses FastAPI + PostgreSQL", infer=True)  # LLM extraction (~30s async)

results = sb.memory.search("what database does this project use?")
for r in results:
    print(r["memory"])

memories = sb.memory.list()

# User-level memory (persists across ALL sandboxes)
from sandflare import UserMemory

mem = UserMemory()
mem.add("Prefers TypeScript over JavaScript")
results = mem.search("language preferences")
memories = mem.list()
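
Search results are dicts accessed as r["memory"] in the example above. A tiny helper for pulling out the top matches, e.g. to splice into an agent prompt (the helper itself is not part of the SDK):

```python
def top_memories(results, limit=3):
    """Extract the memory text from the first `limit` search results."""
    return [r["memory"] for r in results[:limit]]
```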

Custom templates

from sandflare import Template

# Build from Dockerfile
job = Template.build(
    name="my-env",
    dockerfile="FROM ubuntu:22.04\nRUN apt-get update && apt-get install -y python3",
)

# Poll until ready
job = Template.wait_for_build(job.id, timeout_seconds=600)

# Use the template
sb = Sandbox.create("agent", template_id=job.id)

# Manage templates
jobs = Template.list()
job = Template.get_build_status("tmpl-abc123")
Template.delete("tmpl-abc123")
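
Template.wait_for_build is a poll-until-ready loop with a deadline. The same shape, re-implemented generically as a sketch (injectable clock/sleep make it testable; the SDK's internals may differ):

```python
import time

def wait_until(poll, is_done, timeout_seconds=600, interval=2.0,
               clock=time.monotonic, sleep=time.sleep):
    """Call poll() until is_done(result) is true or the deadline passes."""
    deadline = clock() + timeout_seconds
    while True:
        obj = poll()
        if is_done(obj):
            return obj
        if clock() >= deadline:
            raise TimeoutError("operation did not finish within the timeout")
        sleep(interval)
```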

Auto-snapshots

# Enable: snapshot every 15 minutes
sb.enable_auto_snapshot(interval_mins=15)

# Disable
sb.disable_auto_snapshot()

Metrics & processes

m = sb.metrics()
print(f"CPU: {m.cpu_used_pct:.1f}%  RAM: {m.mem_used}/{m.mem_total}")

procs = sb.get_processes()
for p in procs:
    print(p.pid, p.command)

sb.kill_process(1234)
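
metrics() reports raw mem_used/mem_total rather than a percentage. Computing one yourself is a one-liner, with a guard for a zero total (helper name is illustrative; it assumes both values are in the same unit):

```python
def mem_used_pct(mem_used: float, mem_total: float) -> float:
    """Percent of RAM in use; returns 0.0 when the total is unknown."""
    if mem_total <= 0:
        return 0.0
    return 100.0 * mem_used / mem_total
```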

Git clone

result = sb.git_clone(
    "https://github.com/org/repo",
    path="/home/agent/repo",
    branch="main",
    depth=1,
)
print(result.output)
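
The branch and depth options map naturally onto git's own flags. A sketch of the argv such a call might run inside the sandbox (illustrative only; the SDK's actual mechanism is not documented here):

```python
def git_clone_args(url, path=None, branch=None, depth=None):
    """Assemble a `git clone` argv from git_clone-style keyword options."""
    args = ["git", "clone"]
    if branch:
        args += ["--branch", branch]
    if depth:
        args += ["--depth", str(depth)]
    args.append(url)
    if path:
        args.append(path)
    return args
```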

Build & publish

# Build wheel
cd sdk/python
python3 -m pip wheel . --no-deps -w dist/

# Publish to PyPI
TWINE_USERNAME=__token__ TWINE_PASSWORD=<token> python3 -m twine upload dist/*

Download files


Source Distribution

sandflare-2.1.19.tar.gz (27.4 kB)


Built Distribution


sandflare-2.1.19-py3-none-any.whl (25.3 kB)


File details

Details for the file sandflare-2.1.19.tar.gz.

File metadata

  • Download URL: sandflare-2.1.19.tar.gz
  • Upload date:
  • Size: 27.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for sandflare-2.1.19.tar.gz
Algorithm    Hash digest
SHA256       f25f88fbcb2d075b05c590cc1c170c5ed62056bb9baa4ef78523fc8af170a98c
MD5          71795891e0db80e475f11fa92915be71
BLAKE2b-256  c330e10c327ba0ea4ea228aca32fdd1e6c33688ef207da8e9d90f9bd75ed5db5


File details

Details for the file sandflare-2.1.19-py3-none-any.whl.

File metadata

  • Download URL: sandflare-2.1.19-py3-none-any.whl
  • Upload date:
  • Size: 25.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for sandflare-2.1.19-py3-none-any.whl
Algorithm    Hash digest
SHA256       08b2170d3d66b6cdacb9d5bbc70cbb69a68fa3ba4d295156d70fe1de567f3f2a
MD5          c09e17c4216f4d45bdf17bb7569a867e
BLAKE2b-256  1974f16839968e0bddaa953807f3f4f364bb41561949681fbaa0dccb21993222

