AWS contributions for Deep Agents

deepagents-contrib-aws

AWS backend implementations for the deepagents framework.

Backends

S3Backend

An S3-backed implementation of the BackendProtocol that persists agent workspace files in Amazon S3. Supports all eight protocol methods: ls, read, write, edit, grep, glob, upload_files, download_files.

Installation

pip install deepagents-contrib-aws

Quick Start

From constructor

Minimal — S3 client is created implicitly; region and credentials are picked up from environment variables or ~/.aws/config:

from deepagents_contrib_aws import S3Backend

backend = S3Backend(bucket="my-bucket", prefix="agent/workspace/")

Explicit — pass a pre-configured boto3 client (useful for custom endpoints like LocalStack, S3-compatible storage, or reusing an existing session):

import boto3
from deepagents_contrib_aws import S3Backend

client = boto3.client("s3", region_name="us-west-2")
backend = S3Backend(
    bucket="my-bucket",
    prefix="agent/workspace/",
    client=client,
)

From environment variables

export S3_BACKEND_BUCKET=my-bucket
export S3_BACKEND_PREFIX=agent/workspace/
export AWS_REGION=us-west-2  # or AWS_DEFAULT_REGION

from deepagents_contrib_aws import S3Backend

backend = S3Backend.from_env()

With deepagents

from deepagents import create_deep_agent
from deepagents_contrib_aws import S3Backend

backend = S3Backend.from_env()
agent = create_deep_agent(backend=backend)

Basic operations

backend = S3Backend(bucket="my-bucket", prefix="demo/")

# Write a file (errors if file already exists)
result = backend.write("/hello.py", "print('hello')")

# Read it back (with line-based pagination)
result = backend.read("/hello.py")

# Edit it (exact string replacement)
result = backend.edit("/hello.py", "hello", "world")

# List directory
result = backend.ls("/")

# Search file contents (literal text, not regex)
result = backend.grep("world", path="/")

# Find files by pattern
result = backend.glob("*.py")

# Bulk upload
result = backend.upload_files([("/a.txt", b"content a"), ("/b.txt", b"content b")])

# Bulk download
result = backend.download_files(["/a.txt", "/b.txt"])

Configuration

Constructor parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| bucket | str | required | S3 bucket name |
| prefix | str | "" | Key prefix for all objects (e.g. "agent/workspace/") |
| client | boto3 S3 client | None | Optional pre-configured boto3 client |
| region_name | str | None | AWS region for the default client |
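The prefix maps absolute workspace paths onto S3 object keys. A minimal sketch of that mapping (a hypothetical `to_key` helper for illustration, not the library's actual implementation):

```python
def to_key(prefix: str, path: str) -> str:
    # Workspace paths are absolute ("/hello.py"); drop the leading
    # slash and prepend the configured key prefix.
    return prefix + path.lstrip("/")

print(to_key("agent/workspace/", "/hello.py"))  # agent/workspace/hello.py
```

With an empty prefix (the default), keys are simply the path minus its leading slash.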

Environment variables

| Variable | Required | Default | Description |
| --- | --- | --- | --- |
| S3_BACKEND_BUCKET | Yes (for from_env()) | -- | S3 bucket name |
| S3_BACKEND_PREFIX | No | "" | Key prefix for all objects |
| AWS_REGION | No | -- | AWS region (checked first; auto-set by AWS Lambda) |
| AWS_DEFAULT_REGION | No | -- | AWS region (standard boto3 fallback if AWS_REGION is not set) |

AWS credentials are resolved via the standard boto3 credential chain: environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN), shared credentials file (~/.aws/credentials), AWS SSO, or IAM role.

Development

Setup

# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create venv and install all dependencies
uv sync

# Lint
uv run ruff check src/ tests/

# Build
uv build

Testing

The project has two test suites: unit tests (mocked S3 via moto) and integration tests (real S3 bucket).

Unit tests

Unit tests use moto to mock all S3 API calls. No AWS credentials or network access required.

uv run pytest

This runs 58 tests covering all 8 protocol methods, path helpers, error mapping, from_env() factory, and edge cases (binary files, empty files, pagination, partial batch failures).

Integration tests

Integration tests run against a real S3 bucket to validate actual AWS behavior (pagination, error codes, eventual consistency). They are skipped by default.

Prerequisites:

  1. An existing S3 bucket with read/write access
  2. AWS credentials configured via any standard method:
    • Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    • Shared credentials file (~/.aws/credentials)
    • AWS SSO (aws sso login)
    • IAM role (EC2, ECS, Lambda)
  3. IAM permissions on the test bucket:
    • s3:GetObject
    • s3:PutObject
    • s3:DeleteObject (for cleanup after tests)
    • s3:ListBucket
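A minimal IAM policy granting these four permissions might look like the sketch below (the bucket name in the ARNs is a placeholder; note that s3:ListBucket applies to the bucket ARN itself, while the object actions apply to keys under it):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::your-bucket-name"
    }
  ]
}
```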

Running:

S3_TEST_BUCKET=your-bucket-name uv run pytest -m integration

Optionally set the region if the bucket is not in us-east-1:

S3_TEST_BUCKET=your-bucket-name AWS_DEFAULT_REGION=us-west-2 uv run pytest -m integration

Run both unit and integration tests together:

S3_TEST_BUCKET=your-bucket-name uv run pytest -m ""

Test isolation: Integration tests create objects under a unique prefix (integration-test-<uuid>/) and clean up after themselves. They will not leave artifacts in your bucket.
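The unique-prefix scheme can be sketched with the standard library's uuid module (illustrative only; the test suite's exact prefix construction may differ):

```python
import uuid

# Each run writes under its own prefix, so parallel runs cannot collide
# and cleanup reduces to deleting everything under one prefix.
test_prefix = f"integration-test-{uuid.uuid4()}/"
print(test_prefix)
```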

License

MIT
