# Amazon Bedrock AgentCore Runtime and Tools
This module provides a runtime adapter and tools for deploying and extending LlamaIndex agents with Amazon Bedrock AgentCore -- including managed compute via AgentCore Runtime, sandboxed browser automation, and code execution.
## Prerequisites

- AWS credentials configured via environment variables, AWS CLI profile, or IAM role
- IAM permissions for `bedrock-agentcore:*` actions (see the AgentCore documentation for details)
- Python 3.9+
## Installation

(Optional) To run the examples below, first install:

```bash
pip install llama-index llama-index-llms-bedrock-converse
```

Install the main tools package:

```bash
pip install llama-index-tools-aws-bedrock-agentcore
```
## Runtime

The `AgentCoreRuntime` adapter deploys any LlamaIndex agent to Amazon Bedrock AgentCore Runtime -- a managed compute platform for AI agents. It wraps `BedrockAgentCoreApp` from the `bedrock-agentcore` SDK, providing the required `POST /invocations` and `GET /ping` endpoints.
### Quick Start

```python
from llama_index.llms.bedrock_converse import BedrockConverse
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.tools.aws_bedrock_agentcore import AgentCoreRuntime

llm = BedrockConverse(
    model="us.anthropic.claude-sonnet-4-6-v1",
    region_name="us-west-2",
)
agent = FunctionAgent(llm=llm, tools=[])

# One-liner -- starts uvicorn on port 8080
AgentCoreRuntime.serve(agent)
```
### With Options

```python
runtime = AgentCoreRuntime(
    agent=agent,
    stream=True,  # SSE streaming (default)
    port=8080,    # Required port for AgentCore deployment
    debug=False,
)
runtime.run()
```
### With AgentCore Memory

```python
from llama_index.memory.bedrock_agentcore import (
    AgentCoreMemory,
    AgentCoreMemoryContext,
)

memory = AgentCoreMemory(
    context=AgentCoreMemoryContext(
        memory_id="your-memory-id",
        actor_id="user-123",
    ),
    region_name="us-west-2",
)

# The session ID from the X-Amzn-Bedrock-AgentCore-Runtime-Session-Id header
# is automatically wired to memory
AgentCoreRuntime.serve(agent, memory=memory)
```
### Sending Requests

```bash
# Non-streaming
curl -X POST http://localhost:8080/invocations \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello, what can you do?"}'

# Streaming (SSE)
curl -N -X POST http://localhost:8080/invocations \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello, what can you do?"}'
```
The adapter accepts `prompt`, `message`, or `input` as the payload key.
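Whichever key is present, the request body is reduced to a single prompt string. A minimal sketch of that normalization (a hypothetical helper for illustration, not the library's actual code; the key precedence shown is an assumption):

```python
def extract_prompt(payload: dict) -> str:
    """Return the prompt text from an /invocations payload.

    Hypothetical sketch: accepts any of the keys the adapter
    recognizes ("prompt", "message", or "input").
    """
    for key in ("prompt", "message", "input"):
        if key in payload:
            return payload[key]
    raise ValueError("payload must contain 'prompt', 'message', or 'input'")


prompt = extract_prompt({"message": "Hello, what can you do?"})
```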
### Streaming Event Types

When `stream=True` (the default), the SSE stream emits these event types:

| Event | Fields | Description |
|---|---|---|
| `agent_stream` | `delta`, `response`, `thinking_delta?` | Token-by-token LLM output |
| `tool_call` | `tool_name`, `tool_kwargs` | Before tool execution |
| `tool_result` | `tool_name`, `tool_output` | After tool execution |
| `done` | `response` | Final agent response |
| `error` | `message` | Error during streaming |
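Client-side, each SSE `data:` line can be decoded into one of these events and dispatched on its type. A stdlib-only sketch of that decoding (assumptions: each event arrives as a single `data: {json}` line carrying a `type` field naming the event; real SSE also permits multi-line data, comments, and other fields):

```python
import json


def parse_sse_events(raw: str) -> list[dict]:
    """Parse a simple SSE stream where each event is one 'data: {json}' line."""
    events = []
    for line in raw.splitlines():
        line = line.strip()
        if line.startswith("data:"):
            events.append(json.loads(line[len("data:"):].strip()))
    return events


# Hypothetical stream: two token deltas followed by the final response.
stream = (
    'data: {"type": "agent_stream", "delta": "Hel"}\n\n'
    'data: {"type": "agent_stream", "delta": "lo"}\n\n'
    'data: {"type": "done", "response": "Hello"}\n\n'
)
events = parse_sse_events(stream)
text = "".join(e["delta"] for e in events if e["type"] == "agent_stream")
```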
### Testing with ASGI

```python
runtime = AgentCoreRuntime(agent=agent)
app = runtime.app  # BedrockAgentCoreApp (Starlette-based)
# Use with httpx.AsyncClient for testing
```
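Because `runtime.app` is a standard ASGI callable, tests can drive it in-process without opening a socket (for example via `httpx.ASGITransport`). The mechanics can be sketched with plain asyncio; `ping_app` below is a hypothetical stand-in for `runtime.app`, and its response body is an assumption, not the actual `/ping` contract:

```python
import asyncio
import json


async def ping_app(scope, receive, send):
    """Stand-in ASGI app mimicking a GET /ping health endpoint."""
    assert scope["type"] == "http"
    body = json.dumps({"status": "healthy"}).encode()
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"application/json")],
    })
    await send({"type": "http.response.body", "body": body})


async def call(app, path="/ping", method="GET"):
    """Invoke an ASGI app in-process and collect its response."""
    scope = {"type": "http", "method": method, "path": path, "headers": []}
    messages = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        messages.append(message)

    await app(scope, receive, send)
    status = messages[0]["status"]
    body = b"".join(m.get("body", b"") for m in messages[1:])
    return status, body


status, body = asyncio.run(call(ping_app))
```

In a real test you would pass `runtime.app` (instead of `ping_app`) to an ASGI-aware client rather than hand-rolling `call`.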
## Toolspecs
### Browser

The AgentCore Browser toolspec provides a set of tools for interacting with web browsers in a secure sandbox environment. It enables your LlamaIndex agents to navigate websites, extract content, click elements, and more.
Included tools:

- `navigate_browser`: Navigate to a URL
- `click_element`: Click on an element using CSS selectors
- `extract_text`: Extract all text from the current webpage
- `extract_hyperlinks`: Extract all hyperlinks from the current webpage
- `get_elements`: Get elements matching a CSS selector
- `navigate_back`: Navigate to the previous page
- `current_webpage`: Get information about the current webpage
- `generate_live_view_url`: Generate a presigned URL for human oversight of a browser session
- `take_control`: Take manual control of a browser session (disables automation)
- `release_control`: Release manual control (re-enables automation)
Lifecycle methods available for programmatic use (not exposed as agent tools): `list_browsers`, `create_browser`, `delete_browser`, `get_browser`.
You can optionally pass a custom identifier for VPC-enabled browser resources:

```python
tool_spec = AgentCoreBrowserToolSpec(
    region="us-west-2",
    identifier="my-custom-browser-id",
)
```
Example usage:

```python
import asyncio

import nest_asyncio

from llama_index.llms.bedrock_converse import BedrockConverse
from llama_index.tools.aws_bedrock_agentcore import AgentCoreBrowserToolSpec
from llama_index.core.agent.workflow import FunctionAgent

nest_asyncio.apply()  # In case of an existing event loop (e.g. in JupyterLab)


async def main():
    tool_spec = AgentCoreBrowserToolSpec(region="us-west-2")
    tools = tool_spec.to_tool_list()

    llm = BedrockConverse(
        model="us.anthropic.claude-sonnet-4-6-v1",
        region_name="us-west-2",
    )
    agent = FunctionAgent(
        tools=tools,
        llm=llm,
    )

    task = "Go to https://news.ycombinator.com/ and tell me the titles of the top 5 posts."
    response = await agent.run(task)
    print(str(response))

    await tool_spec.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
```
### Code Interpreter

The AgentCore Code Interpreter toolspec provides a set of tools for interacting with a secure code interpreter sandbox environment. It enables your LlamaIndex agents to execute code, run shell commands, manage files, and perform computational tasks.
Included tools:

- `execute_code`: Run code in various languages (primarily Python)
- `execute_command`: Run shell commands
- `read_files`: Read the content of files in the environment
- `list_files`: List files in directories
- `delete_files`: Remove files from the environment
- `write_files`: Create or update files
- `start_command`: Start long-running commands asynchronously
- `get_task`: Check the status of async tasks
- `stop_task`: Stop running tasks
- `upload_file`: Upload a file with an optional semantic description
- `upload_files`: Upload multiple files at once
- `install_packages`: Install Python packages via pip
- `download_file`: Download a file from the sandbox
- `download_files`: Download multiple files from the sandbox
- `clear_context`: Clear all variable state in the Python execution context
Lifecycle methods available for programmatic use (not exposed as agent tools): `list_code_interpreters`, `create_code_interpreter`, `delete_code_interpreter`, `get_code_interpreter`.
You can optionally pass a custom identifier for VPC-enabled code interpreter resources:

```python
tool_spec = AgentCoreCodeInterpreterToolSpec(
    region="us-west-2",
    identifier="my-custom-interpreter-id",
)
```
Example usage:

```python
import asyncio

import nest_asyncio

from llama_index.llms.bedrock_converse import BedrockConverse
from llama_index.tools.aws_bedrock_agentcore import (
    AgentCoreCodeInterpreterToolSpec,
)
from llama_index.core.agent.workflow import FunctionAgent

nest_asyncio.apply()  # In case of an existing event loop (e.g. in JupyterLab)


async def main():
    tool_spec = AgentCoreCodeInterpreterToolSpec(region="us-west-2")
    tools = tool_spec.to_tool_list()

    llm = BedrockConverse(
        model="us.anthropic.claude-sonnet-4-6-v1",
        region_name="us-west-2",
    )
    agent = FunctionAgent(
        tools=tools,
        llm=llm,
    )

    code_task = "Write a Python function that calculates the factorial of a number and test it."
    code_response = await agent.run(code_task)
    print(str(code_response))

    command_task = (
        "Use terminal CLI commands to: 1) Show the environment's Python version. "
        "2) Show the list of Python packages currently installed in the environment."
    )
    command_response = await agent.run(command_task)
    print(str(command_response))

    await tool_spec.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
```
## Example Notebooks