An AWS Labs Model Context Protocol (MCP) server for Bedrock Data Automation

This project has been archived by its maintainers. No new releases are expected.

AWS Bedrock Data Automation MCP Server

⚠️ DEPRECATION NOTICE: This server is deprecated and will no longer receive updates. For Bedrock Data Automation capabilities, use the boto3 API directly or the aws-api-mcp-server. See the migration guide for details.
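The deprecation notice points to boto3 as the replacement. As a rough migration sketch, this server's analyzeasset tool corresponds to an asynchronous invocation on the bedrock-data-automation-runtime client. The parameter shapes below are a best-effort reading of the boto3 API, and the ARNs and S3 URIs are placeholders; verify against the current boto3 reference before relying on them:

```python
def invoke_data_automation(input_s3_uri: str, output_s3_uri: str,
                           project_arn: str, profile_arn: str,
                           region: str = "us-east-1") -> str:
    """Start an asynchronous Bedrock Data Automation job and return its invocation ARN.

    boto3 is imported lazily so this sketch can be read without it installed.
    """
    import boto3  # assumed dependency: pip install boto3

    client = boto3.client("bedrock-data-automation-runtime", region_name=region)
    response = client.invoke_data_automation_async(
        inputConfiguration={"s3Uri": input_s3_uri},
        outputConfiguration={"s3Uri": output_s3_uri},
        dataAutomationConfiguration={"dataAutomationProjectArn": project_arn},
        dataAutomationProfileArn=profile_arn,
    )
    return response["invocationArn"]
```

The job completes asynchronously; poll its status (for example with get_data_automation_status) and then read the results from the output S3 prefix, as the MCP server did internally.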

A Model Context Protocol (MCP) server for Amazon Bedrock Data Automation that enables AI assistants to analyze documents, images, videos, and audio files using Amazon Bedrock Data Automation projects.

Features

  • Project Management: List and get details about Bedrock Data Automation projects
  • Asset Analysis: Extract insights from unstructured content using Bedrock Data Automation
  • Support for Multiple Content Types: Process documents, images, videos, and audio files
  • Integration with Amazon S3: Seamlessly upload and download assets and results

Prerequisites

  1. Install uv from Astral or the GitHub README
  2. Install Python using uv python install 3.10
  3. Set up AWS credentials with access to Amazon Bedrock Data Automation
    • You need an AWS account with Amazon Bedrock Data Automation enabled
    • Configure AWS credentials with aws configure or environment variables
    • Ensure your IAM role/user has permissions to use Amazon Bedrock Data Automation
  4. Create an AWS S3 Bucket
    • Example AWS CLI command to create the bucket:
    •  aws s3 mb s3://<bucket-name>
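The IAM permissions from step 3 can be expressed as a policy along these lines. This is an illustrative sketch only: the action names and resource scoping should be verified against the current IAM action reference for Bedrock Data Automation, and <bucket-name> is a placeholder.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BedrockDataAutomationAccess",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeDataAutomationAsync",
        "bedrock:GetDataAutomationStatus",
        "bedrock:ListDataAutomationProjects",
        "bedrock:GetDataAutomationProject"
      ],
      "Resource": "*"
    },
    {
      "Sid": "AssetBucketAccess",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::<bucket-name>",
        "arn:aws:s3:::<bucket-name>/*"
      ]
    }
  ]
}
```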
      

Installation


Configure the MCP server in your MCP client configuration (e.g., for Kiro, edit ~/.kiro/settings/mcp.json):

{
  "mcpServers": {
    "bedrock-data-automation-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-bedrock-data-automation-mcp-server@latest"],
      "env": {
        "AWS_PROFILE": "your-aws-profile",
        "AWS_REGION": "us-east-1",
        "AWS_BUCKET_NAME": "your-s3-bucket-name",
        "BASE_DIR": "/path/to/base/directory",
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}

Windows Installation

For Windows users, the MCP server configuration format is slightly different:

{
  "mcpServers": {
    "awslabs.aws-bedrock-data-automation-mcp-server": {
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "command": "uv",
      "args": [
        "tool",
        "run",
        "--from",
        "awslabs.aws-bedrock-data-automation-mcp-server@latest",
        "awslabs.aws-bedrock-data-automation-mcp-server.exe"
      ],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_PROFILE": "your-aws-profile",
        "AWS_REGION": "us-east-1"
      }
    }
  }
}

Or run the server with Docker. After a successful docker build -t awslabs/aws-bedrock-data-automation-mcp-server ., create an env file with your AWS credentials:

# fictitious `.env` file with AWS temporary credentials
AWS_ACCESS_KEY_ID=<from the profile you set up>
AWS_SECRET_ACCESS_KEY=<from the profile you set up>
AWS_SESSION_TOKEN=<from the profile you set up>
AWS_REGION=<your-region>
AWS_BUCKET_NAME=<your-s3-bucket-name>
BASE_DIR=/path/to/base/directory

Then reference the env file in your MCP client configuration:
{
  "mcpServers": {
    "bedrock-data-automation-mcp-server": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "--interactive",
        "--env-file",
        "/full/path/to/file/above/.env",
        "awslabs/aws-bedrock-data-automation-mcp-server:latest"
      ],
      "env": {},
      "disabled": false,
      "autoApprove": []
    }
  }
}

NOTE: The temporary credentials in the env file must be refreshed from your host when they expire.

Environment Variables

  • AWS_PROFILE: AWS CLI profile to use for credentials
  • AWS_REGION: AWS region to use (default: us-east-1)
  • AWS_BUCKET_NAME: S3 bucket name for storing assets and results
  • BASE_DIR: Base directory for file operations (optional)
  • FASTMCP_LOG_LEVEL: Logging level (ERROR, WARNING, INFO, DEBUG)
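The variables above resolve roughly as follows. This stdlib-only sketch mirrors the documented defaults; the variable names come from this README, but the resolution function itself is an illustration, not the server's actual code:

```python
import os

def resolve_config(environ: dict) -> dict:
    """Resolve the MCP server settings from environment variables,
    applying the defaults documented above."""
    return {
        "aws_profile": environ.get("AWS_PROFILE"),            # unset -> default credential chain
        "aws_region": environ.get("AWS_REGION", "us-east-1"),
        "bucket_name": environ.get("AWS_BUCKET_NAME"),        # needed for S3 uploads/results
        "base_dir": environ.get("BASE_DIR"),                  # optional
        "log_level": environ.get("FASTMCP_LOG_LEVEL", "ERROR"),
    }

# At startup this would be called as: resolve_config(dict(os.environ))
```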

AWS Authentication

The server uses the AWS profile specified in the AWS_PROFILE environment variable. If no profile is set, it falls back to the default credential provider chain.

"env": {
  "AWS_PROFILE": "your-aws-profile",
  "AWS_REGION": "us-east-1"
}

Make sure the AWS profile has permissions to access Amazon Bedrock Data Automation. The MCP server creates a boto3 session using the specified profile to authenticate with AWS services. Amazon Bedrock Data Automation is currently available in the following regions: us-east-1 and us-west-2.
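Because only those two regions are supported, a fail-fast check on AWS_REGION at startup can save a confusing runtime error. This is a hypothetical helper, not part of the server:

```python
SUPPORTED_REGIONS = {"us-east-1", "us-west-2"}  # per the availability note above

def check_region(region: str) -> str:
    """Return the region unchanged if Bedrock Data Automation supports it, else raise."""
    if region not in SUPPORTED_REGIONS:
        raise ValueError(
            f"Bedrock Data Automation is not available in {region!r}; "
            f"use one of: {sorted(SUPPORTED_REGIONS)}"
        )
    return region
```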

Tools

getprojects

Get a list of data automation projects.

getprojects() -> list

Returns a list of available Bedrock Data Automation projects.

getprojectdetails

Get details of a specific data automation project.

getprojectdetails(projectArn: str) -> dict

Returns detailed information about a specific Bedrock Data Automation project.

analyzeasset

Analyze an asset using a data automation project.

analyzeasset(assetPath: str, projectArn: Optional[str] = None) -> dict

Extracts insights from unstructured content (documents, images, videos, audio) using Amazon Bedrock Data Automation.

  • assetPath: Path to the asset file to analyze
  • projectArn: ARN of the Bedrock Data Automation project to use (optional, uses default public project if not provided)

Example Usage

# List available projects
projects = await getprojects()

# Get details of a specific project
project_details = await getprojectdetails(projectArn="arn:aws:bedrock:us-east-1:123456789012:data-automation-project/my-project")

# Analyze a document
results = await analyzeasset(assetPath="/path/to/document.pdf")

# Analyze an image with a specific project
results = await analyzeasset(
    assetPath="/path/to/image.jpg",
    projectArn="arn:aws:bedrock:us-east-1:123456789012:data-automation-project/my-project"
)

Security Considerations

  • Use AWS IAM roles with appropriate permissions
  • Store credentials securely
  • Use temporary credentials when possible
  • Ensure S3 bucket permissions are properly configured

License

This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.


