
agentswarm-bedrock-agentcore

An implementation of the AWS Bedrock AgentCore SDK for the ai-agentswarm framework.

This library eliminates the boilerplate needed to create and run agents on Bedrock infrastructure.

Installation

pip install agentswarm-bedrock-agentcore

Usage

1. Defining and Hosting your Agent

Extend BedrockAgent to create your agent. You can then use the .serve() method to start a Bedrock AgentCore compatible WebSocket server.

from agentswarm.bedrock_agentcore import BedrockAgent
from agentswarm.datamodels import Context
from agentswarm.llms import GoogleGenAI

# Initialize your LLM
llm = GoogleGenAI(model_name="gemini-1.5-pro")

class MyBedrockAgent(BedrockAgent):
    def id(self) -> str:
        return "my-awesome-agent"

    async def execute(self, user_id: str, context: Context, input: str | None = None):
        # Your custom agent logic here
        # The context will have the default_llm if passed to .serve()
        return f"Bedrock Agent says: I processed '{input}'"

if __name__ == "__main__":
    # Start the server and pass the default_llm
    MyBedrockAgent().serve(port=8000, default_llm=llm)

2. Invoking your Agent Remotely

Use BedrockRemoteAgent to call an agent that is already running. It supports both local WebSocket endpoints and AWS Bedrock ARNs.

from agentswarm.bedrock_agentcore import BedrockRemoteAgent
from agentswarm.datamodels import Context

# Use an ARN for cloud invocation (after deployment)
# or "http://localhost:8000" for local testing
AGENT_ENDPOINT = "arn:aws:bedrock-agentcore:us-west-2:123456789012:runtime/my-agent-abc"

# Create a proxy for the remote agent
remote_agent = BedrockRemoteAgent(
    endpoint_url=AGENT_ENDPOINT,
    remote_agent_id="my-awesome-agent"
)

# Use it like any other AgentSwarm agent.
# Note: execute() is async, so invoke it from inside an event loop,
# and provide the required Context arguments.
import asyncio

async def main():
    result = await remote_agent.execute(
        user_id="user-123",
        context=Context(trace_id="trace-1", messages=[], store=None, tracing=None),
        input="Hello Bedrock!",
    )
    print(result)

asyncio.run(main())

Deployment

To deploy your agent natively to the Amazon Bedrock AgentCore Runtime, use the official Bedrock AgentCore Starter Toolkit.

1. Install the Toolkit

pip install bedrock-agentcore-starter-toolkit

2. Configure your Agent

Run the interactive configuration tool to set up your deployment (entrypoint, region, runtime, etc.).

agentcore configure

This will create or update a .bedrock_agentcore.yaml file with your settings.
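For orientation, the generated file might look roughly like the sketch below. The keys shown (entrypoint, region, env) are assumptions inferred from the configure prompts above, not the toolkit's authoritative schema; always let `agentcore configure` generate the real file.

```yaml
# Illustrative sketch only -- generate the actual file with `agentcore configure`.
entrypoint: examples/bedrock_demo.py
region: us-west-2
env:
  DEFAULT_LLM_MODEL: gemini-2.5-flash
```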

3. Launch to AWS Bedrock

The launch command packages your code, installs dependencies from requirements.txt, and deploys it to the managed Bedrock runtime.

# Recommended: Cloud-based deployment (Direct Code Deploy)
agentcore launch

This command will return an Agent ARN which you can then use to invoke your agent via BedrockRemoteAgent.

Configuration

requirements.txt

Ensure your requirements.txt includes the necessary libraries for the remote runtime:

ai-agentswarm>=0.5.1
agentswarm-bedrock-agentcore>=0.1.0

default_llm Support

You can configure the model via environment variables in your .bedrock_agentcore.yaml:

env:
  DEFAULT_LLM_MODEL: gemini-2.5-flash
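At startup, the server entrypoint can read this variable and fall back to a default. The sketch below assumes the env block surfaces as ordinary process environment variables in the managed runtime; the commented wiring reuses names from the Usage section above and is hypothetical.

```python
import os

# Simulate what the env block in .bedrock_agentcore.yaml would set
# (assumption: it surfaces as plain process environment variables).
os.environ["DEFAULT_LLM_MODEL"] = "gemini-2.5-flash"

# Read the configured model name, with a sensible fallback.
model_name = os.environ.get("DEFAULT_LLM_MODEL", "gemini-1.5-pro")

# Hypothetical wiring, reusing names from the Usage section:
# llm = GoogleGenAI(model_name=model_name)
# MyBedrockAgent().serve(port=8000, default_llm=llm)
print(model_name)
```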

Quick Start: Testing and Deployment

1. Install Dependencies

# Core logic and Bedrock implementation
pip install ai-agentswarm agentswarm-bedrock-agentcore

# Deployment Toolkit
pip install bedrock-agentcore-starter-toolkit

2. Local Testing

Verify the bridge before deploying.

Server:

export RUN_SERVER=true
python examples/bedrock_demo.py

Client (Proxy):

# In another terminal
python examples/bedrock_demo.py
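The demo script itself is not reproduced in this README; a minimal sketch of the toggle it presumably implements, assuming the RUN_SERVER environment variable selects server mode as in the commands above:

```python
import os

def run_mode() -> str:
    # RUN_SERVER=true starts the agent server; anything else runs the client proxy.
    if os.environ.get("RUN_SERVER", "").lower() == "true":
        return "server"  # would call MyBedrockAgent().serve(port=8000, ...)
    return "client"      # would build BedrockRemoteAgent("http://localhost:8000", ...)

print(run_mode())
```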

3. Native Bedrock Deployment

  1. export UV_CACHE_DIR=./.uv_cache (optional; works around uv cache permission issues)
  2. agentcore configure
  3. agentcore launch

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
