
AWS S3 MCP Server

Amazon S3 storage integration for AI assistants via Model Context Protocol (MCP)

Python 3.10+ · MIT License

Developed and maintained by Arclio - Secure MCP service management for AI applications


🚀 Quick Start

Test the server immediately using the Model Context Protocol (MCP) Inspector, or install and run it directly.

Option 1: Instant Setup with MCP Inspector (Recommended for Testing)

# Test with MCP Inspector
npx @modelcontextprotocol/inspector \
  -e AWS_ACCESS_KEY_ID="your-aws-access-key" \
  -e AWS_SECRET_ACCESS_KEY="your-aws-secret-key" \
  -e AWS_REGION="us-east-1" \
  -e S3_BUCKETS="your-bucket-name" \
  -- \
  uvx --from aws-s3-mcp aws-s3-mcp

Replace the environment variables with your actual AWS credentials and bucket names.

Option 2: Direct Installation & Usage

  1. Install the package:

    pip install aws-s3-mcp
    
  2. Set Environment Variables:

    export AWS_ACCESS_KEY_ID="your-aws-access-key"
    export AWS_SECRET_ACCESS_KEY="your-aws-secret-key"
    export AWS_REGION="us-east-1"
    export S3_BUCKETS="your-bucket-name"
    
  3. Run the MCP Server:

    python -m aws_s3_mcp
    

Option 3: Using uvx (Run without full installation)

# Ensure AWS_* environment variables are set as shown above
uvx --from aws-s3-mcp aws-s3-mcp
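
If your MCP-compatible client is configured via a JSON file (for example, the common mcpServers convention used by clients such as Claude Desktop), the server can be registered like this. The exact file location and top-level key depend on your client; the values below are placeholders:

```json
{
  "mcpServers": {
    "aws-s3": {
      "command": "uvx",
      "args": ["--from", "aws-s3-mcp", "aws-s3-mcp"],
      "env": {
        "AWS_ACCESS_KEY_ID": "your-aws-access-key",
        "AWS_SECRET_ACCESS_KEY": "your-aws-secret-key",
        "AWS_REGION": "us-east-1",
        "S3_BUCKETS": "your-bucket-name"
      }
    }
  }
}
```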

📋 Overview

aws-s3-mcp is a Python package that enables AI models to interact with Amazon S3 storage through the Model Context Protocol (MCP). It acts as a secure and standardized bridge, allowing AI assistants to leverage S3's powerful object storage capabilities without direct credential exposure.

What is MCP?

The Model Context Protocol (MCP) provides a standardized interface for AI models to discover and utilize external tools and services. This package implements an MCP server that exposes S3 capabilities as discrete, callable "tools."

Key Benefits

  • AI-Ready Integration: Purpose-built for AI assistants to naturally interact with S3 storage.
  • Standardized Protocol: Ensures seamless integration with MCP-compatible AI systems and hubs.
  • Enhanced Security: AWS credentials remain on the server, isolated from the AI models.
  • Fully Asynchronous: Uses aioboto3 exclusively for non-blocking I/O operations.
  • Smart Content Detection: Automatically differentiates between text and binary files.
  • Robust Error Handling: Comprehensive error reporting with actionable messages.
  • Configurable Access: Support for bucket filtering and access controls.

🏗️ Prerequisites & Setup

Step 1: AWS Credentials Setup

You need valid AWS credentials with S3 access permissions:

Option A: AWS Access Keys (Recommended for Development)

  1. Get your AWS credentials:

    • Sign in to the AWS Management Console
    • Navigate to IAM → Users → Your User → Security credentials
    • Create access key if you don't have one
  2. Set required permissions: Ensure your AWS user/role has the following S3 permissions:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "s3:ListBucket",
            "s3:GetObject"
          ],
          "Resource": [
            "arn:aws:s3:::your-bucket-name",
            "arn:aws:s3:::your-bucket-name/*"
          ]
        }
      ]
    }
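
To avoid hand-editing ARNs for every bucket, a policy like the one above can be generated programmatically. This is an illustrative sketch; the `generate_read_policy` helper is not part of the package:

```python
import json

def generate_read_policy(buckets):
    """Build a read-only S3 IAM policy covering the given bucket names."""
    resources = []
    for bucket in buckets:
        resources.append(f"arn:aws:s3:::{bucket}")    # bucket itself, for s3:ListBucket
        resources.append(f"arn:aws:s3:::{bucket}/*")  # objects within it, for s3:GetObject
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetObject"],
                "Resource": resources,
            }
        ],
    }

print(json.dumps(generate_read_policy(["my-documents", "my-images"]), indent=2))
```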
    

Option B: AWS CLI Configuration

If you have the AWS CLI configured, the server will automatically use those credentials:

aws configure

Step 2: S3 Bucket Access

Ensure you have at least one S3 bucket with objects you want to access. The server can be configured to access specific buckets or all accessible buckets.

⚙️ Configuration

Environment Variables

The MCP server requires the following environment variables:

# Essential AWS credentials
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"        # Your AWS access key
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG..."  # Your AWS secret key
export AWS_REGION="us-east-1"                           # AWS region

# Optional S3 configuration
export S3_BUCKETS="bucket1,bucket2,bucket3"             # Comma-separated list of allowed buckets
export S3_MAX_BUCKETS="5"                               # Maximum buckets to list (default: 5)
export S3_OBJECT_MAX_KEYS="1000"                        # Maximum objects per request (default: 1000)
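
A minimal sketch of how a server might read these variables, using the documented defaults. The `load_config` helper is illustrative, not the package's actual API:

```python
import os

def load_config():
    """Read S3 MCP settings from the environment, applying documented defaults."""
    buckets = os.environ.get("S3_BUCKETS", "")
    return {
        "aws_region": os.environ.get("AWS_REGION", "us-east-1"),
        # An empty S3_BUCKETS means no explicit allow-list was configured
        "buckets": [b.strip() for b in buckets.split(",") if b.strip()],
        "max_buckets": int(os.environ.get("S3_MAX_BUCKETS", "5")),
        "max_keys": int(os.environ.get("S3_OBJECT_MAX_KEYS", "1000")),
    }
```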

Configuration File

For persistent configuration, create a .env file:

# Copy the example file (if available)
cp .env.example .env

# Or create your own .env file
cat > .env << EOF
AWS_ACCESS_KEY_ID=your-access-key-here
AWS_SECRET_ACCESS_KEY=your-secret-key-here
AWS_REGION=us-east-1
S3_BUCKETS=my-documents,my-images
S3_MAX_BUCKETS=10
S3_OBJECT_MAX_KEYS=500
EOF

🛠️ Exposed Capabilities (Tools)

This package exposes the following tools for AI interaction with Amazon S3.

Object Listing Tools

  • s3_list_objects: List objects within a specified S3 bucket with optional prefix filtering

Content Retrieval Tools

  • s3_get_object_content: Retrieve content from S3 objects with automatic text/binary detection and proper encoding

🔍 Troubleshooting

Connection Issues

  • "AWS credentials not found": Ensure AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are set

    echo $AWS_ACCESS_KEY_ID
    echo $AWS_SECRET_ACCESS_KEY
    
  • "Access Denied": Check your AWS IAM permissions for S3 access

  • "Bucket not found": Verify the bucket name and region are correct

Configuration Issues

  • "InvalidAccessKeyId": Verify your AWS access key is correct and active
  • "SignatureDoesNotMatch": Check that your AWS secret key is correct
  • "Bucket not in configured list": Add the bucket to your S3_BUCKETS environment variable

MCP Server Issues

  • "Tool not found": Verify the tool name matches exactly (case-sensitive)
  • "Invalid arguments": Check the tool's parameter requirements and types
  • "Server not responding": Check server logs for error messages

For detailed debugging, inspect the server's stdout/stderr logs.

📚 Usage Examples

Listing S3 Objects

# List all objects in a bucket
result = await s3_list_objects(bucket_name="my-documents")

# List objects with prefix filter
result = await s3_list_objects(
    bucket_name="my-documents",
    prefix="reports/2024/",
    max_keys=50
)

# Example response:
{
  "count": 2,
  "objects": [
    {
      "key": "reports/2024/q1-report.pdf",
      "last_modified": "2024-03-20T10:00:00Z",
      "size": 123456,
      "etag": "abc1234"
    },
    {
      "key": "reports/2024/q1-summary.md",
      "last_modified": "2024-03-21T14:30:00Z",
      "size": 5678,
      "etag": "def5678"
    }
  ]
}

Retrieving Object Content

# Get text file content
result = await s3_get_object_content(
    bucket_name="my-documents",
    key="reports/quarterly-report.md"
)
# Returns: {
#   "content": "# Q1 Report\n\nThis quarter...",
#   "mime_type": "text/markdown",
#   "encoding": "utf-8",
#   "size": 1024
# }

# Get binary file content (automatically Base64 encoded)
result = await s3_get_object_content(
    bucket_name="my-documents",
    key="reports/chart.pdf"
)
# Returns: {
#   "content": "JVBERi0xLjQ...",  # Base64 encoded
#   "mime_type": "application/pdf",
#   "encoding": "base64",
#   "size": 51200
# }

Working with Different File Types

The server automatically detects content types and handles encoding:

  • Text files (.txt, .md, .json, .csv, etc.): Returned as UTF-8 strings
  • Binary files (.pdf, .jpg, .zip, etc.): Returned as Base64 encoded strings
  • Unknown types: Analyzed heuristically for text vs binary content
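
A heuristic of this kind can be approximated as follows. This is a sketch of the idea, not the package's actual detection logic, and `build_content_result` is a hypothetical helper:

```python
import base64
import mimetypes

def build_content_result(key, data):
    """Return a content dict for raw object bytes, guessing text vs binary."""
    mime_type, _ = mimetypes.guess_type(key)
    if b"\x00" not in data:  # null bytes almost never appear in text files
        try:
            return {
                "content": data.decode("utf-8"),
                "mime_type": mime_type or "text/plain",
                "encoding": "utf-8",
                "size": len(data),
            }
        except UnicodeDecodeError:
            pass  # not valid UTF-8: fall through and treat as binary
    return {
        "content": base64.b64encode(data).decode("ascii"),
        "mime_type": mime_type or "application/octet-stream",
        "encoding": "base64",
        "size": len(data),
    }
```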

📝 Contributing

Contributions are welcome! Please refer to the main README.md in the arclio-mcp-tooling monorepo for guidelines on contributing, setting up the development environment, and project-wide commands.

Development Setup

# Clone the monorepo
git clone https://github.com/your-org/arclio-mcp-tooling.git
cd arclio-mcp-tooling

# Set up development environment
make setup-dev

# Run tests for AWS S3 MCP
make test tests/aws-s3-mcp

# Run with coverage
make cov tests/aws-s3-mcp

# Lint code
make lint aws-s3-mcp

📄 License

This package is licensed under the MIT License. See the LICENSE file in the monorepo root for full details.

🏢 About Arclio

Arclio provides secure and robust Model Context Protocol (MCP) solutions, enabling AI applications to safely and effectively interact with enterprise systems and external services.


Built with ❤️ by the Arclio team

Download files

Source Distribution

  • aws_s3_mcp-0.1.3.tar.gz (14.5 kB)

Built Distribution

  • aws_s3_mcp-0.1.3-py3-none-any.whl (17.2 kB)

File details

Details for the file aws_s3_mcp-0.1.3.tar.gz.

File metadata

  • Filename: aws_s3_mcp-0.1.3.tar.gz
  • Upload date:
  • Size: 14.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.7.2

File hashes

  • SHA256: 0c48bf56e1b4c5e841c8fa53b21350fbb245c49d975977b03d988d552bee62bd
  • MD5: 606a71e19812dcb97c4f44074473c942
  • BLAKE2b-256: efb35c0f1119b9625a9dc00eaf555d69fbe838027d6671a67a6bd6a197c725f9

File details

Details for the file aws_s3_mcp-0.1.3-py3-none-any.whl.

File hashes

  • SHA256: 19b1129433862ad7d0782a7cb3c5809cc3c045b6c15d9eb657ff7d131980e55a
  • MD5: a152b3435a81f0d90a419081c94f1f97
  • BLAKE2b-256: 952cb747b96d4a83fd5ec222e6a7936a9a99227632dbbfd2a48502b908274e84
