
File Storage & Sandbox Backends for Pydantic AI

Console Toolset, Docker Sandbox, and Permission System for AI Agents


Console Toolset — ls, read, write, edit, grep, execute  •  Docker Sandbox — isolated code execution  •  Permission System — fine-grained access control


File Storage & Sandbox Backends provides everything your Pydantic AI agent needs to work with files and execute code safely. Choose from in-memory, local filesystem, or Docker-isolated backends.

Full framework? Check out Pydantic Deep Agents — complete agent framework with planning, filesystem, subagents, and skills.

Use Cases

| What You Want to Build | How This Library Helps |
| --- | --- |
| AI Coding Assistant | Console toolset with file ops + code execution |
| Multi-User Web App | Docker sandboxes with session isolation |
| Code Review Bot | Read-only backend with grep/glob search |
| Secure Execution | Permission system blocks dangerous operations |
| Testing/CI | In-memory `StateBackend` for fast, isolated tests |

Installation

pip install pydantic-ai-backend

Or with uv:

uv add pydantic-ai-backend

Optional extras:

# Console toolset (requires pydantic-ai)
pip install pydantic-ai-backend[console]

# Docker sandbox support
pip install pydantic-ai-backend[docker]

# Everything
pip install pydantic-ai-backend[console,docker]

Quick Start — ConsoleCapability (Recommended)

The simplest way to give your agent filesystem tools:

from pydantic_ai import Agent
from pydantic_ai_backends import ConsoleCapability

agent = Agent("openai:gpt-4.1", capabilities=[ConsoleCapability()])

With Permissions

from pydantic_ai import Agent
from pydantic_ai_backends import ConsoleCapability
from pydantic_ai_backends.permissions import READONLY_RULESET

# Read-only agent — write/edit/execute tools are hidden from the model
agent = Agent("openai:gpt-4.1", capabilities=[ConsoleCapability(permissions=READONLY_RULESET)])

Alternative: Toolset API

from dataclasses import dataclass
from pydantic_ai import Agent
from pydantic_ai_backends import LocalBackend, create_console_toolset

@dataclass
class Deps:
    backend: LocalBackend

agent = Agent(
    "openai:gpt-4.1",
    deps_type=Deps,
    toolsets=[create_console_toolset()],
)

backend = LocalBackend(root_dir="./workspace")
result = agent.run_sync(
    "Create a Python script that calculates fibonacci and run it",
    deps=Deps(backend=backend),
)
print(result.output)

That's it. Your agent can now:

  • List files and directories (ls)
  • Read and write files (read_file, write_file)
  • Edit files with string replacement (edit_file)
  • Search with glob patterns and regex (glob, grep)
  • Execute shell commands (execute)

Available Backends

| Backend | Storage | Execution | Use Case |
| --- | --- | --- | --- |
| `StateBackend` | In-memory | No | Testing, ephemeral sessions |
| `LocalBackend` | Filesystem | Yes | Local development, CLI tools |
| `DockerSandbox` | Container | Yes | Multi-user, untrusted code |
| `CompositeBackend` | Routed | Varies | Complex multi-source setups |

In-Memory (StateBackend)

from pydantic_ai_backends import StateBackend

backend = StateBackend()
# Files stored in memory, perfect for tests
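Conceptually, an in-memory backend is just a mapping from paths to contents, which is what makes it fast and isolated for tests. A toy sketch of the idea — the class and method names here are illustrative, not the library's actual API:

```python
class InMemoryStore:
    """Toy in-memory file store illustrating the StateBackend idea."""

    def __init__(self) -> None:
        self._files: dict[str, str] = {}

    def write_file(self, path: str, content: str) -> None:
        self._files[path] = content

    def read_file(self, path: str) -> str:
        return self._files[path]

    def ls(self, prefix: str = "") -> list[str]:
        # List stored paths under a prefix, sorted for stable output.
        return sorted(p for p in self._files if p.startswith(prefix))


store = InMemoryStore()
store.write_file("notes/todo.txt", "write tests")
```

Because nothing touches the disk, each test can construct a fresh store and discard it when done.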

Local Filesystem (LocalBackend)

from pydantic_ai_backends import LocalBackend

backend = LocalBackend(
    root_dir="/workspace",
    allowed_directories=["/workspace", "/shared"],
    enable_execute=True,
)
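The `allowed_directories` option implies path confinement: every path the agent supplies must normalize to a location under one of the allowed roots, so `..` tricks cannot escape the workspace. A hedged sketch of how such a check typically works — this is not the library's internal code:

```python
from pathlib import PurePosixPath

def is_allowed(path: str, allowed_roots: list[str]) -> bool:
    """Return True if `path` normalizes to a location under an allowed root."""
    # Normalize away "." and ".." segments without touching the real filesystem.
    parts: list[str] = []
    for part in PurePosixPath(path).parts:
        if part == "..":
            if parts:
                parts.pop()
        elif part != ".":
            parts.append(part)
    resolved = PurePosixPath(*parts) if parts else PurePosixPath("/")
    return any(
        resolved == PurePosixPath(root) or PurePosixPath(root) in resolved.parents
        for root in allowed_roots
    )
```

A real implementation would also resolve symlinks, which string normalization alone cannot catch.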

Docker Sandbox (DockerSandbox)

from pydantic_ai_backends import DockerSandbox

sandbox = DockerSandbox(runtime="python-datascience")
sandbox.start()
# Fully isolated container environment
sandbox.stop()
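Under the hood, Docker isolation boils down to running the agent's commands inside a container with restricted resources. A sketch of the kind of command plumbing such a backend might use — the parameter names and defaults here are illustrative, not the library's internals:

```python
def docker_run_argv(image: str, command: list[str], *,
                    workdir: str = "/workspace",
                    network: bool = False,
                    memory: str = "512m") -> list[str]:
    """Build a one-shot `docker run` argv for isolated execution."""
    argv = ["docker", "run", "--rm",        # remove the container on exit
            "--workdir", workdir,
            "--memory", memory]             # cap container memory
    if not network:
        argv += ["--network", "none"]       # no outbound network by default
    return argv + [image, *command]


argv = docker_run_argv("python:3.12-slim", ["python", "-c", "print('hi')"])
```

Passing the argv to `subprocess.run` would then execute the command fully inside the container.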

Console Toolset

Ready-to-use tools for pydantic-ai agents:

from pydantic_ai_backends import create_console_toolset

# All tools enabled
toolset = create_console_toolset()

# Without shell execution
toolset = create_console_toolset(include_execute=False)

# With approval requirements
toolset = create_console_toolset(
    require_write_approval=True,
    require_execute_approval=True,
)

# With custom tool descriptions
toolset = create_console_toolset(
    descriptions={
        "execute": "Run shell commands in the workspace",
        "read_file": "Read file contents from the workspace",
    }
)

Available tools: ls, read_file, write_file, edit_file, glob, grep, execute
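The approval flags above gate specific tools behind a human (or programmatic) decision before they run. A minimal sketch of that pattern — the wrapper and callback here are illustrative, not the library's mechanism:

```python
from typing import Callable

def with_approval(tool: Callable[..., str],
                  approve: Callable[[str], bool]) -> Callable[..., str]:
    """Wrap a tool so each call is submitted to an approval callback first."""
    def gated(*args, **kwargs) -> str:
        request = f"{tool.__name__}{args}"
        if not approve(request):
            return f"denied: {request}"
        return tool(*args, **kwargs)
    return gated


def execute(cmd: str) -> str:
    return f"ran: {cmd}"

# Auto-deny anything that looks destructive; approve the rest.
gated_execute = with_approval(execute, lambda req: "rm -rf" not in req)
```

In an interactive app the callback would surface the request to the user instead of applying a fixed rule.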

Image Support

For multimodal models, enable image file handling:

toolset = create_console_toolset(image_support=True)

# Now read_file on .png/.jpg/.gif/.webp returns BinaryContent
# that multimodal models (GPT-4o, Claude, etc.) can see directly
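The switch is essentially extension-based: image files come back as binary payloads tagged with a media type, everything else as text. A rough sketch of that dispatch — the function and return shape here are illustrative stand-ins, not the library's `BinaryContent` handling:

```python
IMAGE_MEDIA_TYPES = {
    ".png": "image/png",
    ".jpg": "image/jpeg",
    ".jpeg": "image/jpeg",
    ".gif": "image/gif",
    ".webp": "image/webp",
}

def classify_read(path: str) -> tuple[str, str]:
    """Return ("binary", media_type) for images, ("text", "text/plain") otherwise."""
    suffix = "." + path.rsplit(".", 1)[-1].lower() if "." in path else ""
    media_type = IMAGE_MEDIA_TYPES.get(suffix)
    if media_type is not None:
        return ("binary", media_type)
    return ("text", "text/plain")
```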

Permission System

Fine-grained access control:

from pydantic_ai_backends import LocalBackend
from pydantic_ai_backends.permissions import DEFAULT_RULESET, READONLY_RULESET

# Safe defaults (allow reads, ask for writes)
backend = LocalBackend(root_dir="/workspace", permissions=DEFAULT_RULESET)

# Read-only mode
backend = LocalBackend(root_dir="/workspace", permissions=READONLY_RULESET)

| Preset | Description |
| --- | --- |
| `DEFAULT_RULESET` | Allow reads (except secrets), ask for writes/executes |
| `PERMISSIVE_RULESET` | Allow most operations, deny dangerous commands |
| `READONLY_RULESET` | Allow reads only, deny all writes and executes |
| `STRICT_RULESET` | Everything requires approval |
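A pattern-based ruleset like these presets can be modeled as ordered (pattern, decision) pairs evaluated first-match-wins, with decisions of allow, ask, or deny. An illustrative sketch using glob patterns — this is not the library's actual rule format:

```python
from fnmatch import fnmatch

def decide(ruleset: list[tuple[str, str]],
           operation: str, default: str = "ask") -> str:
    """Return the decision ("allow" | "ask" | "deny") from the first matching rule."""
    for pattern, decision in ruleset:
        if fnmatch(operation, pattern):
            return decision
    return default

# Rough analogue of DEFAULT_RULESET: reads allowed (except secrets), writes ask.
RULES = [
    ("read:*.env", "deny"),    # secrets are denied before the read catch-all
    ("read:*", "allow"),
    ("write:*", "ask"),
    ("execute:*", "ask"),
]
```

Ordering matters: the secrets rule must precede the general read rule, or it would never fire.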

Docker Runtimes

Pre-configured environments:

| Runtime | Base Image | Packages |
| --- | --- | --- |
| `python-minimal` | python:3.12-slim | (none) |
| `python-datascience` | python:3.12-slim | pandas, numpy, matplotlib, scikit-learn |
| `python-web` | python:3.12-slim | fastapi, uvicorn, sqlalchemy, httpx |
| `node-minimal` | node:20-slim | (none) |
| `node-react` | node:20-slim | typescript, vite, react |

Custom runtime:

from pydantic_ai_backends import DockerSandbox, RuntimeConfig

runtime = RuntimeConfig(
    name="ml-env",
    base_image="python:3.12-slim",
    packages=["torch", "transformers"],
)
sandbox = DockerSandbox(runtime=runtime)
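A runtime config like this typically expands into a small Dockerfile: the base image plus a package-install layer. A hedged sketch of that expansion, using a stand-in dataclass rather than the library's `RuntimeConfig` (the real build steps may differ):

```python
from dataclasses import dataclass, field

@dataclass
class Runtime:
    """Illustrative stand-in for a RuntimeConfig-style object."""
    name: str
    base_image: str
    packages: list[str] = field(default_factory=list)

def to_dockerfile(rt: Runtime) -> str:
    """Render a minimal Dockerfile: base image, workdir, optional pip layer."""
    lines = [f"FROM {rt.base_image}", "WORKDIR /workspace"]
    if rt.packages:
        lines.append("RUN pip install --no-cache-dir " + " ".join(rt.packages))
    return "\n".join(lines) + "\n"


dockerfile = to_dockerfile(Runtime("ml-env", "python:3.12-slim", ["torch", "transformers"]))
```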

Session Manager

Multi-user web applications:

from pydantic_ai_backends import SessionManager

manager = SessionManager(
    default_runtime="python-datascience",
    workspace_root="/app/workspaces",
)

# Each user gets isolated sandbox
sandbox = await manager.get_or_create("user-123")
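The core of a session manager is per-key reuse: look up an existing sandbox for the session id, and create one under a lock if it is missing so concurrent requests cannot race. A minimal async sketch of that pattern with sandbox creation stubbed out — not the library's implementation:

```python
import asyncio

class TinySessionManager:
    """Illustrative get-or-create cache keyed by session id."""

    def __init__(self, default_runtime: str) -> None:
        self.default_runtime = default_runtime
        self._sessions: dict[str, dict] = {}
        self._lock = asyncio.Lock()

    async def get_or_create(self, session_id: str) -> dict:
        async with self._lock:  # serialize creation for the same id
            if session_id not in self._sessions:
                # A real manager would start a sandbox here; we store a stub.
                self._sessions[session_id] = {
                    "id": session_id,
                    "runtime": self.default_runtime,
                }
            return self._sessions[session_id]


async def demo() -> bool:
    manager = TinySessionManager("python-datascience")
    a = await manager.get_or_create("user-123")
    b = await manager.get_or_create("user-123")
    return a is b  # second call reuses the first session

same = asyncio.run(demo())
```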

Why Choose This Library?

| Feature | Description |
| --- | --- |
| Multiple Backends | In-memory, filesystem, Docker — same interface |
| Console Toolset | Ready-to-use tools for pydantic-ai agents |
| Permission System | Pattern-based access control with presets |
| Docker Isolation | Safe execution of untrusted code |
| Session Management | Multi-user support with workspace persistence |
| Image Support | Multimodal models can see images via BinaryContent |
| Pre-built Runtimes | Python and Node.js environments ready to go |

Related Projects

| Package | Description |
| --- | --- |
| Pydantic Deep Agents | Full agent framework (uses this library) |
| pydantic-ai-todo | Task planning toolset |
| subagents-pydantic-ai | Multi-agent orchestration |
| summarization-pydantic-ai | Context management |
| pydantic-ai | The foundation — agent framework by Pydantic |

Contributing

git clone https://github.com/vstorm-co/pydantic-ai-backend.git
cd pydantic-ai-backend
make install
make test  # 100% coverage required

License

MIT — see LICENSE


Need help implementing this in your company?

We're Vstorm — an Applied Agentic AI Engineering Consultancy
with 30+ production AI agent implementations.

Talk to us



Made with ❤️ by Vstorm
