
AWS Bedrock AgentCore implementation for ai-agentswarm.


agentswarm-bedrock-agentcore

An implementation of the AWS Bedrock AgentCore SDK for the ai-agentswarm framework.

This library eliminates the boilerplate required to create and run agents on the AWS Bedrock AgentCore infrastructure.

Installation

pip install agentswarm-bedrock-agentcore

Usage

1. Defining and Hosting your Agent

Extend BedrockAgent to create your agent, then call the .serve() method to start a Bedrock AgentCore-compatible WebSocket server.

from agentswarm.bedrock_agentcore import BedrockAgent
from agentswarm.datamodels import Context
from agentswarm.llms import GoogleGenAI

# Initialize your LLM
llm = GoogleGenAI(model_name="gemini-1.5-pro")

class MyBedrockAgent(BedrockAgent):
    def id(self) -> str:
        return "my-awesome-agent"

    async def execute(self, user_id: str, context: Context, input: str | None = None):
        # Your custom agent logic here
        # The context will have the default_llm if passed to .serve()
        return f"Bedrock Agent says: I processed '{input}'"

if __name__ == "__main__":
    # Start the server and pass the default_llm
    MyBedrockAgent().serve(port=8000, default_llm=llm)

2. Invoking your Agent Remotely

Use BedrockRemoteAgent to call an agent that is already running. It supports both local WebSocket endpoints and AWS Bedrock ARNs.

import asyncio

from agentswarm.bedrock_agentcore import BedrockRemoteAgent
from agentswarm.datamodels import Context

# Use an ARN for cloud invocation (after deployment)
# or "http://localhost:8000" for local testing
AGENT_ENDPOINT = "arn:aws:bedrock-agentcore:us-west-2:123456789012:runtime/my-agent-abc"

# Create a proxy for the remote agent
remote_agent = BedrockRemoteAgent(
    endpoint_url=AGENT_ENDPOINT,
    remote_agent_id="my-awesome-agent"
)

# Use it like any other AgentSwarm agent. execute() is a coroutine,
# so run it inside an event loop.
async def main():
    # Note: provide the required context arguments for initialization
    result = await remote_agent.execute(
        user_id="user-123",
        context=Context(trace_id="trace-1", messages=[], store=None, tracing=None),
        input="Hello Bedrock!"
    )
    print(result)

asyncio.run(main())

Deployment

To deploy your agent natively to the Amazon Bedrock AgentCore Runtime, use the official Bedrock AgentCore Starter Toolkit.

1. Install the Toolkit

pip install bedrock-agentcore-starter-toolkit

2. Configure your Agent

Run the interactive configuration tool to set up your deployment (entrypoint, region, runtime, etc.).

agentcore configure

This will create or update a .bedrock_agentcore.yaml file with your settings.
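The file is generated and managed by the toolkit, but a rough sketch of its shape might look like the following. The field names here (other than the env block shown later in this README) are illustrative assumptions; run agentcore configure to produce the real schema.

```yaml
# Illustrative sketch only -- `agentcore configure` writes the actual file.
name: my-awesome-agent          # assumed key: agent name
entrypoint: my_agent.py         # assumed key: your agent's entry script
region: us-west-2               # assumed key: target AWS region
env:
  DEFAULT_LLM_MODEL: gemini-2.5-flash
```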

3. Launch to AWS Bedrock

The launch command packages your code, installs dependencies from requirements.txt, and deploys it to the managed Bedrock runtime.

# Recommended: Cloud-based deployment (Direct Code Deploy)
agentcore launch

This command will return an Agent ARN which you can then use to invoke your agent via BedrockRemoteAgent.

Configuration

requirements.txt

Ensure your requirements.txt includes the necessary libraries for the remote runtime:

ai-agentswarm>=0.5.1
agentswarm-bedrock-agentcore>=0.1.0

default_llm Support

You can configure the model via environment variables in your .bedrock_agentcore.yaml:

env:
  DEFAULT_LLM_MODEL: gemini-2.5-flash
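Inside your agent code you can then pick the model name up from the environment. A minimal sketch (the resolve_model_name helper is hypothetical, not part of the library):

```python
import os

# Hypothetical helper: read DEFAULT_LLM_MODEL from the environment,
# falling back to a default model name when it is unset.
def resolve_model_name(default: str = "gemini-1.5-pro") -> str:
    return os.environ.get("DEFAULT_LLM_MODEL", default)
```

You could then build the LLM with GoogleGenAI(model_name=resolve_model_name()) and pass it to .serve() as default_llm.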

Quick Start: Testing and Deployment

1. Install Dependencies

# Core logic and Bedrock implementation
pip install ai-agentswarm agentswarm-bedrock-agentcore

# Deployment Toolkit
pip install bedrock-agentcore-starter-toolkit

2. Local Testing

Verify the server/client bridge locally before deploying.

Server:

export RUN_SERVER=true
python examples/bedrock_demo.py

Client (Proxy):

# In another terminal
python examples/bedrock_demo.py
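The same script serves both roles, presumably switching on the RUN_SERVER variable. A sketch of that pattern (the structure is an assumption; check examples/bedrock_demo.py for the actual code):

```python
import os

def run_as_server() -> bool:
    # RUN_SERVER=true selects server mode; anything else runs the client.
    return os.environ.get("RUN_SERVER", "").lower() == "true"

def main() -> None:
    if run_as_server():
        # Server mode, e.g.:
        # MyBedrockAgent().serve(port=8000, default_llm=llm)
        ...
    else:
        # Client mode, e.g.:
        # BedrockRemoteAgent(endpoint_url="http://localhost:8000", ...)
        ...

if __name__ == "__main__":
    main()
```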

3. Native Bedrock Deployment

  1. export UV_CACHE_DIR=./.uv_cache (optional; works around cache-permission issues)
  2. agentcore configure
  3. agentcore launch

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
