
SDK for Chalk sandboxes, containers, and volumes


Chalk Sandbox SDK

Python SDK for the Chalk Sandbox gRPC service. Create sandboxes, execute commands, and stream output over bidirectional gRPC streams.
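A bidirectional gRPC stream needs a request iterator that the client can keep feeding while responses arrive. The sketch below shows one common way to build such an iterator with a queue; it is an illustration of the general pattern, not the SDK's actual internals, and the `RequestStream` name is invented here.

```python
import queue

# Sentinel object marking the end of the request stream.
_DONE = object()

class RequestStream:
    """Queue-backed iterator for feeding a gRPC bidirectional stream.

    gRPC stub methods for bidi RPCs accept any iterator of request
    messages; backing the iterator with a queue lets another thread
    push stdin chunks while the RPC is already in flight.
    """

    def __init__(self):
        self._queue = queue.Queue()

    def send(self, request):
        self._queue.put(request)

    def close(self):
        self._queue.put(_DONE)

    def __iter__(self):
        while True:
            item = self._queue.get()
            if item is _DONE:
                return
            yield item

stream = RequestStream()
stream.send("echo hello\n")
stream.send("exit\n")
stream.close()
print(list(stream))  # -> ['echo hello\n', 'exit\n']
```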

Install

pip install grpcio protobuf

Quick start

from sandbox import SandboxClient

client = SandboxClient("localhost:50051")

# Create a sandbox from a pre-built image
sandbox = client.create(image="ubuntu:latest", name="my-sandbox")

# Run a command
result = sandbox.exec("echo", "hello world")
print(result.stdout_text)  # "hello world"
print(result.exit_code)    # 0

# Clean up
sandbox.terminate()

Declarative images

Build custom container images with a fluent API instead of writing Dockerfiles. The image spec is serialized as protobuf and transmitted to the sandbox service, which builds and caches the image before starting the container.

from sandbox import SandboxClient
from image import Image

client = SandboxClient("localhost:50051")

# Build a data-science image declaratively
img = (
    Image.debian_slim("3.12")
    .pip_install(["pandas", "numpy", "scikit-learn"])
    .run_commands(
        "apt-get update && apt-get install -y git curl",
    )
    .workdir("/home/user/app")
    .env({"PYTHONDONTWRITEBYTECODE": "1"})
)

sandbox = client.create(image=img, name="data-science")
result = sandbox.exec("python", "-c", "import pandas; print(pandas.__version__)")
print(result.stdout_text)
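Because the image spec is serialized before it is sent, the service can cache builds by content rather than by name. One plausible scheme is to key the cache on a digest of the canonical serialized spec; the sketch below uses JSON as a stand-in for protobuf, and none of these names come from the SDK.

```python
import hashlib
import json

def image_cache_key(spec: dict) -> str:
    """Derive a content-addressed cache key from an image spec.

    Serializing with sorted keys makes the digest canonical, so two
    structurally identical specs always map to the same cached build.
    """
    canonical = json.dumps(spec, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

base = {"base": "python:3.12-slim-bookworm", "pip": ["pandas", "numpy"]}
same = {"pip": ["pandas", "numpy"], "base": "python:3.12-slim-bookworm"}
other = {"base": "python:3.12-slim-bookworm", "pip": ["flask"]}

assert image_cache_key(base) == image_cache_key(same)   # key order is irrelevant
assert image_cache_key(base) != image_cache_key(other)  # any step change busts the cache
```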

Base images

# Arbitrary base image
img = Image.base("node:22-slim")

# Convenience: python + debian slim
img = Image.debian_slim("3.12")  # python:3.12-slim-bookworm

# From an existing Dockerfile (contents are inlined, so you can chain more steps)
img = Image.from_dockerfile("Dockerfile").pip_install(["extra-dep"])
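Chaining works after from_dockerfile because the file's contents are inlined into the spec rather than referenced by path. A minimal sketch of that mechanic (inline_dockerfile is a hypothetical helper, not an SDK function):

```python
import os
import tempfile

def inline_dockerfile(path, extra_pip=()):
    """Read a Dockerfile eagerly and append extra instructions after it.

    Because the text is captured at call time, later edits to the file
    on disk don't affect the spec, and more steps can be chained on top.
    """
    with open(path) as f:
        lines = [line.rstrip("\n") for line in f]
    for pkg in extra_pip:
        lines.append(f"RUN pip install {pkg}")
    return lines

with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "Dockerfile")
    with open(p, "w") as f:
        f.write("FROM python:3.12-slim\nWORKDIR /app\n")
    spec = inline_dockerfile(p, extra_pip=["extra-dep"])

assert spec == ["FROM python:3.12-slim", "WORKDIR /app", "RUN pip install extra-dep"]
```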

Build steps

img = (
    Image.debian_slim("3.12")
    # Install Python packages
    .pip_install(["requests>=2.28", "flask"])

    # Install from a requirements.txt (read locally, inlined into the spec)
    .pip_install_from_requirements("requirements.txt")

    # Run shell commands (each becomes a Docker RUN layer)
    .run_commands(
        "apt-get update && apt-get install -y git",
        "mkdir -p /app/data",
    )

    # Add local files into the image
    .add_local_file("config.yaml", "/app/config.yaml")
    .add_local_file("entrypoint.sh", "/app/entrypoint.sh", mode=0o755)
    .add_local_dir("src", "/app/src")

    # Raw Dockerfile instructions
    .dockerfile_commands(["EXPOSE 8080", "HEALTHCHECK CMD curl -f http://localhost:8080/"])

    # Image-level configuration
    .workdir("/app")
    .env({"FLASK_APP": "app:create_app"})
    .entrypoint(["/app/entrypoint.sh"])
    .cmd(["serve"])
)

Immutable composition

Each builder method returns a new Image, so intermediate images can be shared:

base = Image.debian_slim("3.12").pip_install(["requests"])

# Two different images that share the same base
api_image = base.pip_install(["flask"]).workdir("/api")
worker_image = base.pip_install(["celery"]).workdir("/worker")

api_sandbox = client.create(image=api_image, name="api")
worker_sandbox = client.create(image=worker_image, name="worker")
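The immutability that makes this sharing safe can be modeled with a frozen dataclass whose builder methods return a copy with one step appended. This is a simplified model of the pattern, not the SDK's actual Image class:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ImageSpec:
    base: str
    steps: tuple = ()  # immutable step list; copied, never mutated

    def pip_install(self, packages):
        # Return a NEW spec; the receiver is untouched, so a shared
        # base can safely fan out into many derived images.
        return replace(self, steps=self.steps + (("pip", tuple(packages)),))

    def workdir(self, path):
        return replace(self, steps=self.steps + (("workdir", path),))

base = ImageSpec("python:3.12-slim").pip_install(["requests"])
api = base.pip_install(["flask"]).workdir("/api")
worker = base.pip_install(["celery"]).workdir("/worker")

assert len(base.steps) == 1               # base is unchanged by derivations
assert api.steps[:1] == worker.steps[:1]  # both share the base's first step
```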

Connecting

from sandbox import SandboxClient
import grpc

# Insecure (local dev)
client = SandboxClient("localhost:50051")

# With TLS
creds = grpc.ssl_channel_credentials()
client = SandboxClient("sandbox.example.com:443", credentials=creds)

# As a context manager
with SandboxClient("localhost:50051") as client:
    ...

Sandbox lifecycle

# Create with resource limits
sandbox = client.create(
    image="ubuntu:latest",
    name="dev-sandbox",
    cpu="2",
    memory="4Gi",
    env={"DEBIAN_FRONTEND": "noninteractive"},
)

# List all sandboxes
for info in client.list():
    print(f"{info.id} {info.state} {info.name}")

# Get a handle to an existing sandbox by ID
sandbox = client.get(id="550e8400-e29b-41d4-a716-446655440000")

# Fetch info from server
print(sandbox.info.state)
sandbox.refresh()  # force re-fetch

# Terminate
sandbox.terminate()
sandbox.terminate(grace_period_seconds=30)
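A sandbox keeps consuming resources until it is terminated, so it is worth pairing create with guaranteed cleanup. The context manager below is a sketch built only on the create/terminate calls shown above; the stub classes exist so the pattern can be exercised without a running service.

```python
from contextlib import contextmanager

@contextmanager
def managed_sandbox(client, **create_kwargs):
    """Create a sandbox and guarantee terminate() runs, even on error."""
    sandbox = client.create(**create_kwargs)
    try:
        yield sandbox
    finally:
        sandbox.terminate()

# Stubs standing in for the real client, for demonstration only.
class _StubSandbox:
    def __init__(self):
        self.terminated = False
    def terminate(self):
        self.terminated = True

class _StubClient:
    def create(self, **kwargs):
        self.last = _StubSandbox()
        return self.last

client = _StubClient()
try:
    with managed_sandbox(client, image="ubuntu:latest", name="tmp") as sb:
        raise RuntimeError("work failed")
except RuntimeError:
    pass
assert client.last.terminated  # cleanup ran despite the error
```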

Executing commands

Run and wait

result = sandbox.exec("ls", "-la", "/tmp")
for line in result.stdout:
    print(line)
for line in result.stderr:
    print(f"ERR: {line}")
print(f"exit code: {result.exit_code}")

# Or get the full text at once
print(result.stdout_text)
print(result.stderr_text)

Stream output in real time

import sys

for event in sandbox.exec_stream("make", "build", workdir="/app"):
    if event.stdout:
        print(event.stdout, end="")
    if event.stderr:
        print(event.stderr, end="", file=sys.stderr)
    if event.is_exited:
        print(f"\nDone: exit code {event.exit_code}")

Interactive processes (stdin + signals)

process = sandbox.exec_start("bash")

process.write_stdin("echo hello\n")
process.write_stdin("exit\n")
process.close_stdin()

for event in process.output():
    if event.stdout:
        print(event.stdout, end="")

Send signals to running processes:

import signal

process = sandbox.exec_start("sleep", "300")
process.send_signal(signal.SIGTERM)
result = process.wait()

Options

All exec methods accept the same keyword arguments:

result = sandbox.exec(
    "python", "train.py",
    workdir="/app",                     # working directory
    timeout_secs=3600,                  # kill after 1 hour
    env={"CUDA_VISIBLE_DEVICES": "0"},  # environment variables
)

Examples

Clone a GitHub repo into a sandbox

from sandbox import SandboxClient

client = SandboxClient("localhost:50051")
sandbox = client.create(image="ubuntu:latest", name="repo-sandbox")

# Install git
sandbox.exec("apt-get", "update")
sandbox.exec("apt-get", "install", "-y", "git")

# Clone
result = sandbox.exec(
    "git", "clone", "https://github.com/chalk-ai/chalk.git", "/workspace/chalk"
)
if result.exit_code != 0:
    print(f"Clone failed: {result.stderr_text}")
else:
    # List what we got
    result = sandbox.exec("ls", "-la", "/workspace/chalk")
    for line in result.stdout:
        print(line)

Spawn an OpenCode agent in a sandbox

OpenCode is a terminal-based AI coding agent. You can run it inside a sandbox to give it an isolated environment to work in.

import sys

from sandbox import SandboxClient

client = SandboxClient("localhost:50051")
sandbox = client.create(
    image="ubuntu:latest",
    name="opencode-agent",
    cpu="2",
    memory="4Gi",
    env={
        "ANTHROPIC_API_KEY": "sk-ant-...",
    },
)

# Install dependencies
sandbox.exec("apt-get", "update")
sandbox.exec("apt-get", "install", "-y", "git", "curl", "build-essential")

# Install Go (opencode is a Go binary)
sandbox.exec("bash", "-c", "curl -fsSL https://go.dev/dl/go1.24.1.linux-amd64.tar.gz | tar -C /usr/local -xz")
sandbox.exec("bash", "-c", "echo 'export PATH=$PATH:/usr/local/go/bin:/root/go/bin' >> /root/.bashrc")

# Install opencode
sandbox.exec("bash", "-c", "export PATH=$PATH:/usr/local/go/bin:/root/go/bin && go install github.com/opencode-ai/opencode@latest")

# Clone a repo to work on
sandbox.exec("git", "clone", "https://github.com/your-org/your-repo.git", "/workspace/repo")

# Run opencode non-interactively with a prompt
result = sandbox.exec(
    "bash", "-c",
    "export PATH=$PATH:/usr/local/go/bin:/root/go/bin && cd /workspace/repo && opencode -p 'fix the failing tests in pkg/auth'",
    timeout_secs=600,
)
print(result.stdout_text)

# Or run it interactively and feed it commands
process = sandbox.exec_start(
    "bash", "-c",
    "export PATH=$PATH:/usr/local/go/bin:/root/go/bin && cd /workspace/repo && opencode",
)

# Stream its output
for event in process.output():
    if event.stdout:
        print(event.stdout, end="")
    if event.stderr:
        print(event.stderr, end="", file=sys.stderr)
    if event.is_exited:
        break

Long-running build with real-time output

import sys

sandbox = client.create(image="node:22", name="build")

sandbox.exec("git", "clone", "https://github.com/your-org/frontend.git", "/app")
sandbox.exec("npm", "install", workdir="/app")

# Stream the build output as it happens
for event in sandbox.exec_stream("npm", "run", "build", workdir="/app"):
    if event.stdout:
        print(event.stdout, end="")
    if event.stderr:
        print(event.stderr, end="", file=sys.stderr)
    if event.is_exited and event.exit_code != 0:
        print(f"Build failed with exit code {event.exit_code}")

sandbox.terminate()

CLI tools

sandbox_exec.py - Run a command

python sandbox_exec.py --target localhost:50051 --sandbox-id <id> --exec "ls -la"

sandbox_stdout.py - Interactive shell

echo "echo hello" | python sandbox_stdout.py --target localhost:50051 --sandbox-id <id> --exec "bash"

Regenerating proto stubs

If the proto definition changes, regenerate the Python stubs:

pip install grpcio-tools
./generate.sh

Project details

Download files

Source distribution: chalkcompute-1.4.2.tar.gz (129.5 kB)
Built distribution: chalkcompute-1.4.2-py3-none-any.whl (170.0 kB)

File details

chalkcompute-1.4.2.tar.gz

  • Size: 129.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing: Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

Hashes:

  • SHA256: 61dbfdffbd579001e9e48b8053bd71a257a8e04d0da3939263bd48e2cc6294c7
  • MD5: 9e8ffae8de36bbeb61652ac1cefae783
  • BLAKE2b-256: 0d51bfb0559918abc44d5291f46cb7f68bcabba70ff520115d8a543a7b112b51

Provenance: attested via Trusted Publishing by release.yml on chalk-ai/chalk-sandbox-sdk.

File details

chalkcompute-1.4.2-py3-none-any.whl

  • Size: 170.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing: Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

Hashes:

  • SHA256: a0174773565b968e8165a15cb0e8e33c1dfce42202a00e65c8695f6e3ca49103
  • MD5: d594a1fb2827b0fe4b6b3be7ea2d24b4
  • BLAKE2b-256: 34de1e67df147fd77d8e5a79acbe4e4daf2bf06067784965d0ef50da835f4a06

Provenance: attested via Trusted Publishing by release.yml on chalk-ai/chalk-sandbox-sdk.
