Project description
Strands Agents
A model-driven approach to building AI agents in just a few lines of code.
Documentation ◆ Samples ◆ Python SDK ◆ Tools ◆ Agent Builder ◆ MCP Server
Strands Agents is a simple yet powerful SDK that takes a model-driven approach to building and running AI agents. From simple conversational assistants to complex autonomous workflows, from local development to production deployment, Strands Agents scales with your needs.
Feature Overview
- Lightweight & Flexible: Simple agent loop that just works and is fully customizable
- Model Agnostic: Support for Amazon Bedrock, Anthropic, Gemini, LiteLLM, Llama, Ollama, OpenAI, Writer, and custom providers
- Advanced Capabilities: Multi-agent systems, autonomous agents, and streaming support (see the streaming sketch after this list)
- Built-in MCP: Native support for Model Context Protocol (MCP) servers, enabling access to thousands of pre-built tools
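As a quick illustration of the streaming support listed above, an agent's response can also be consumed as an asynchronous event stream. This is a minimal sketch, assuming the stream_async API described in the documentation, where each event is a dict and text chunks arrive under a "data" key:
import asyncio
from strands import Agent

agent = Agent()

async def main():
    # Assumption: Agent.stream_async yields event dicts with text chunks under "data"
    async for event in agent.stream_async("Tell me about Agentic AI"):
        if "data" in event:
            print(event["data"], end="")

asyncio.run(main())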
Quick Start
# Install Strands Agents
pip install strands-agents strands-agents-tools
from strands import Agent
from strands_tools import calculator
agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
Note: For the default Amazon Bedrock model provider, you'll need AWS credentials configured and model access enabled for Claude 4 Sonnet in the us-west-2 region. See the Quickstart Guide for details on configuring other model providers.
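If you would rather not rely on the default model, you can pass a BedrockModel explicitly, as shown in the providers section below. A minimal sketch, assuming BedrockModel accepts a region_name argument and that the model ID shown is enabled in your account (verify both against the Quickstart Guide):
from strands import Agent
from strands.models import BedrockModel

# Assumed arguments — substitute a model ID and region enabled for your AWS account
bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-sonnet-4-20250514-v1:0",
    region_name="us-west-2",
)
agent = Agent(model=bedrock_model)
agent("What is the square root of 1764")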
Installation
Ensure you have Python 3.10+ installed, then:
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows use: .venv\Scripts\activate
# Install Strands and tools
pip install strands-agents strands-agents-tools
Features at a Glance
Python-Based Tools
Easily build tools using Python decorators:
from strands import Agent, tool
@tool
def word_count(text: str) -> int:
    """Count words in text.

    This docstring is used by the LLM to understand the tool's purpose.
    """
    return len(text.split())
agent = Agent(tools=[word_count])
response = agent("How many words are in this sentence?")
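Tools can also take multiple typed parameters and defaults; the type hints and docstring are what the model sees when deciding how to call the tool. A small illustrative sketch (the to_celsius tool below is hypothetical, not part of strands_tools):
from strands import Agent, tool

@tool
def to_celsius(fahrenheit: float, ndigits: int = 1) -> float:
    """Convert a temperature from Fahrenheit to Celsius.

    Args:
        fahrenheit: Temperature in degrees Fahrenheit.
        ndigits: Number of decimal places to round the result to.
    """
    return round((fahrenheit - 32) * 5 / 9, ndigits)

agent = Agent(tools=[to_celsius])
response = agent("What is 72 degrees Fahrenheit in Celsius?")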
Hot Reloading from Directory:
Enable automatic tool loading and reloading from the ./tools/ directory:
from strands import Agent
# Agent will watch ./tools/ directory for changes
agent = Agent(load_tools_from_directory=True)
response = agent("Use any tools you find in the tools directory")
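A tool file placed in that directory can look like any other decorated tool. A hypothetical ./tools/shout.py, assuming directory-loaded tools can be plain @tool-decorated functions like the ones above:
# ./tools/shout.py (hypothetical file name)
from strands import tool

@tool
def shout(text: str) -> str:
    """Return the input text in uppercase."""
    return text.upper()
With load_tools_from_directory=True, edits to files in ./tools/ are picked up without restarting the agent.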
MCP Support
Seamlessly integrate Model Context Protocol (MCP) servers:
from strands import Agent
from strands.tools.mcp import MCPClient
from mcp import stdio_client, StdioServerParameters
aws_docs_client = MCPClient(
    lambda: stdio_client(StdioServerParameters(command="uvx", args=["awslabs.aws-documentation-mcp-server@latest"]))
)

with aws_docs_client:
    agent = Agent(tools=aws_docs_client.list_tools_sync())
    response = agent("Tell me about Amazon Bedrock and how to use it with Python")
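MCP tools can also be combined with local Python tools in the same agent. A sketch reusing the AWS docs server above together with the calculator tool from strands_tools, assuming list_tools_sync() returns an ordinary Python list that can be concatenated with other tools:
from strands import Agent
from strands.tools.mcp import MCPClient
from strands_tools import calculator
from mcp import stdio_client, StdioServerParameters

aws_docs_client = MCPClient(
    lambda: stdio_client(StdioServerParameters(command="uvx", args=["awslabs.aws-documentation-mcp-server@latest"]))
)

with aws_docs_client:
    # MCP-provided tools and local tools share one tool list
    agent = Agent(tools=aws_docs_client.list_tools_sync() + [calculator])
    response = agent("Summarize the Amazon Bedrock docs, then compute 1024 * 4")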
Multiple Model Providers
Support for various model providers:
from strands import Agent
from strands.models import BedrockModel
from strands.models.ollama import OllamaModel
from strands.models.llamaapi import LlamaAPIModel
from strands.models.gemini import GeminiModel
from strands.models.llamacpp import LlamaCppModel
# Bedrock
bedrock_model = BedrockModel(
    model_id="us.amazon.nova-pro-v1:0",
    temperature=0.3,
    streaming=True,  # Enable/disable streaming
)
agent = Agent(model=bedrock_model)
agent("Tell me about Agentic AI")

# Google Gemini
gemini_model = GeminiModel(
    client_args={
        "api_key": "your_gemini_api_key",
    },
    model_id="gemini-2.5-flash",
    params={"temperature": 0.7}
)
agent = Agent(model=gemini_model)
agent("Tell me about Agentic AI")

# Ollama
ollama_model = OllamaModel(
    host="http://localhost:11434",
    model_id="llama3"
)
agent = Agent(model=ollama_model)
agent("Tell me about Agentic AI")

# Llama API
llama_model = LlamaAPIModel(
    model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
)
agent = Agent(model=llama_model)
response = agent("Tell me about Agentic AI")
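The other built-in providers listed below follow the same construction pattern. For example, an OpenAI-backed agent — a sketch assuming OpenAIModel takes client_args, model_id, and params like the Gemini provider above (check the provider docs for the exact signature):
from strands import Agent
from strands.models.openai import OpenAIModel

# Assumed to mirror the client_args / model_id / params pattern shown above
openai_model = OpenAIModel(
    client_args={"api_key": "your_openai_api_key"},
    model_id="gpt-4o",
    params={"temperature": 0.7},
)
agent = Agent(model=openai_model)
agent("Tell me about Agentic AI")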
Built-in providers:
- Amazon Bedrock
- Anthropic
- Gemini
- Cohere
- LiteLLM
- llama.cpp
- LlamaAPI
- MistralAI
- Ollama
- OpenAI
- SageMaker
- Writer
Custom providers can be implemented by following the Custom Providers guide.
Example tools
Strands offers an optional strands-agents-tools package with pre-built tools for quick experimentation:
from strands import Agent
from strands_tools import calculator
agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
It's also available on GitHub via strands-agents/tools.
Documentation
For detailed guidance & examples, explore our documentation.
Contributing ❤️
We welcome contributions! See our Contributing Guide for details on:
- Reporting bugs & features
- Development setup
- Contributing via Pull Requests
- Code of Conduct
- Reporting of security issues
License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Security
See CONTRIBUTING for more information.
Project details
Download files
File details
Details for the file strands_agents-1.11.0.tar.gz.
File metadata
- Download URL: strands_agents-1.11.0.tar.gz
- Upload date:
- Size: 411.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 52899a2205ac612442f7773deb1e45c7fc6e897d54f8ea70068f2e60879ff25a |
| MD5 | da2af61659379ae02c7309d8583e58e1 |
| BLAKE2b-256 | 67737ea29184c8f4c07cb4e6c6805ced5deef71e9fdf124d62fe38c9bc92fbcc |
Provenance
The following attestation bundles were made for strands_agents-1.11.0.tar.gz:
Publisher: pypi-publish-on-release.yml on strands-agents/sdk-python
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: strands_agents-1.11.0.tar.gz
- Subject digest: 52899a2205ac612442f7773deb1e45c7fc6e897d54f8ea70068f2e60879ff25a
- Sigstore transparency entry: 592555799
- Sigstore integration time:
- Permalink: strands-agents/sdk-python@2a26ffad8bc7379358bc2535d9ce1ec290fea0af
- Branch / Tag: refs/tags/v1.11.0
- Owner: https://github.com/strands-agents
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-publish-on-release.yml@2a26ffad8bc7379358bc2535d9ce1ec290fea0af
- Trigger Event: release
File details
Details for the file strands_agents-1.11.0-py3-none-any.whl.
File metadata
- Download URL: strands_agents-1.11.0-py3-none-any.whl
- Upload date:
- Size: 213.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ff698d5cc73fbeb9993c11e7caa0b74d5a822faf99a65a68d9ef39e10b21158e |
| MD5 | 5aa47af0c79947865befc9b61c804145 |
| BLAKE2b-256 | 4b9e4d2b0d21505edc4b50c3acf5559832a364d36701c263bcc5735400c68ef3 |
Provenance
The following attestation bundles were made for strands_agents-1.11.0-py3-none-any.whl:
Publisher: pypi-publish-on-release.yml on strands-agents/sdk-python
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: strands_agents-1.11.0-py3-none-any.whl
- Subject digest: ff698d5cc73fbeb9993c11e7caa0b74d5a822faf99a65a68d9ef39e10b21158e
- Sigstore transparency entry: 592555802
- Sigstore integration time:
- Permalink: strands-agents/sdk-python@2a26ffad8bc7379358bc2535d9ce1ec290fea0af
- Branch / Tag: refs/tags/v1.11.0
- Owner: https://github.com/strands-agents
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-publish-on-release.yml@2a26ffad8bc7379358bc2535d9ce1ec290fea0af
- Trigger Event: release