Deploy AI-powered web applications to AWS with a single command
Three Stars ⭐⭐⭐
The fastest AI agent deployment tool for prototyping on AWS.
Three Stars doesn't require you to set up CDK, wait for CloudFormation stack deployments, or struggle with circular dependency errors. All you need is Python — the same language you use to develop your AI agent. Our promise rests on three principles:
- Speed — Deployment speed is our top priority
- Steps — Streamlined steps including tool setup and error handling
- Small — Just enough for prototyping
Three Stars provides the `sss` command (short for Speed, Steps, Small) for quick deployment. Commands are available both as a CLI and as an MCP server for your AI coding agent.
| Command | Description |
|---|---|
| `sss init <name>` | Scaffold a new project with config, frontend, and agent templates |
| `sss deploy` | Deploy (or redeploy) the project to AWS — S3, AgentCore, Lambda@Edge, CloudFront |
| `sss status` | Show deployment status of all AWS resources |
| `sss destroy` | Tear down all deployed AWS resources |
Quick Start
Prerequisites:
- Python 3.12+
- AWS credentials configured (`aws configure`)
- Permissions for S3, CloudFront, IAM, Lambda, and Bedrock AgentCore
Use with AI Agents (Recommended)
The easiest way to use three-stars is through an AI coding agent; three-stars provides scaffolding tools for agents like Claude Code. Add this to your Claude Code MCP settings:
```json
{
  "mcpServers": {
    "three-stars": {
      "command": "uvx",
      "args": ["three-stars-mcp"]
    }
  }
}
```
Install as CLI
pip install three-stars
This installs the sss command. From zero to deployed in four lines:
pip install three-stars
sss init my-app && cd my-app
sss deploy
# Open the printed CloudFront URL in your browser
How Commands Work
Scaffold a Project: sss init
sss init my-app
cd my-app
This creates the following structure:
my-app/
├── three-stars.yml # Configuration
├── app/ # Frontend — use any framework (React, Vue, etc.)
│ └── index.html
└── agent/ # AI agent (Python)
├── agent.py # Strands Agent with SSE streaming
├── tools.py # MCP tool loader
├── memory.py # AgentCore Memory session manager
└── mcp.json # MCP server configuration
The app/ directory starts with a plain HTML file, but you can replace it with your favorite frontend framework — React, Vue, Svelte, or anything that builds to static files. Just point app.source in the config to your build output directory.
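For example, if a React build writes its output to app/dist, the relevant fragment of three-stars.yml (using the same keys as the full config shown below) would be:

```yaml
app:
  source: ./app/dist   # your frontend build output directory
  index: index.html
```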
The agent/ directory contains a starter Strands Agent that streams responses as Server-Sent Events. It supports:
- MCP Tools — add tool servers in `agent/mcp.json` (stdio and HTTP transports). Environment variable references (`${VAR}`) and AWS credentials are forwarded automatically.
- Conversation Memory — when AgentCore Memory is configured, conversation history is preserved across turns within a session.
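The `${VAR}` forwarding described above can be sketched in a few lines of plain Python. This is an illustrative sketch of the idea, not the actual loader in tools.py:

```python
import os
import re

def expand_env_refs(value: str) -> str:
    """Replace ${VAR} references with values from the environment."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)

# Example: forward an API key into an MCP server's env block
os.environ["MY_API_KEY"] = "sk-example"
server_env = {"API_KEY": "${MY_API_KEY}"}
resolved = {k: expand_env_refs(v) for k, v in server_env.items()}
print(resolved)  # {'API_KEY': 'sk-example'}
```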
three-stars.yml controls your deployment:
```yaml
name: my-ai-app
region: us-east-1
agent:
  source: ./agent
  model: us.anthropic.claude-sonnet-4-6  # Any Bedrock model ID
  description: "My AI assistant"
app:
  source: ./app
  index: index.html
api:
  prefix: /api
```
Deploy to AWS: sss deploy
Deploy your application to AWS.
sss deploy
First deploy typically completes in ~5 minutes.
[1/5] S3 storage ready 0:00:01
[2/5] AgentCore ready 0:00:48
[3/5] Lambda@Edge function ready 0:00:04
[4/5] CloudFront distribution deployed 0:00:45
[5/5] AgentCore resource policy set 0:00:02
Post-Deployment Health Check
┌────────────┬───────────────────┬──────────┐
│ Resource │ ID / Name │ Status │
├────────────┼───────────────────┼──────────┤
│ S3 Bucket │ sss-my-app-… │ Active │
│ AgentCore │ rt-abc123 │ Ready │
│ CloudFront │ E1234567890 │ Deployed │
└────────────┴───────────────────┴──────────┘
Deployed successfully!
URL: https://d1234567890.cloudfront.net
Open the URL to see your AI agent chat app — the frontend streams responses in real time with Markdown rendering and tool call indicators.
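Under the hood, an SSE response is just text frames separated by blank lines. A minimal parser for the data payloads (illustrative only; the exact event fields the starter agent emits are an assumption) could look like:

```python
def parse_sse(stream: str) -> list[str]:
    """Collect the data payloads from a raw SSE stream (as text).

    Each event is one or more 'data:' lines terminated by a blank line;
    multi-line payloads are joined with newlines, per the SSE format.
    """
    events, buffer = [], []
    for line in stream.splitlines():
        if line.startswith("data:"):
            buffer.append(line[5:].lstrip())
        elif line == "" and buffer:  # a blank line ends the event
            events.append("\n".join(buffer))
            buffer = []
    return events

chunks = parse_sse("data: Hello\n\ndata: world\n\n")
print(chunks)  # ['Hello', 'world']
```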
```mermaid
graph LR
    Browser["User Browser"]
    CF["CloudFront Distribution<br/>(HTTPS CDN)"]
    S3["S3 Bucket<br/>(Static Frontend)"]
    Edge["Lambda@Edge<br/>(SigV4 Signing)"]
    AC["Bedrock AgentCore<br/>(AI Agent Runtime)"]
    Browser --> CF
    CF -- "/*" --> S3
    CF -- "/api/*" --> Edge --> AC
```
| Resource | Service | Purpose |
|---|---|---|
| S3 Bucket | Amazon S3 | Frontend static files (private, OAC access) |
| AgentCore Runtime | Bedrock AgentCore | Runs AI agent code with Bedrock model access |
| Lambda@Edge Function | AWS Lambda@Edge | SigV4 signing for API requests to AgentCore |
| CloudFront Distribution | Amazon CloudFront | CDN with HTTPS |
| IAM Roles | AWS IAM | Execution permissions (AgentCore, Lambda@Edge) |
Subsequent deploys are even faster because dependencies are cached and only changed resources update.
sss deploy # ~23 seconds on redeploy
| Flag | Description |
|---|---|
| `--region` | Override AWS region |
| `--profile` | AWS CLI profile name |
| `--yes` / `-y` | Skip confirmation prompts |
| `--force` | Recreate all resources from scratch |
| `--verbose` / `-v` | Print detailed progress (ARNs, policy names) |
Check Status: sss status
Show deployment status of all AWS resources.
sss status
Use --sync to discover actual resources from AWS and update the local state file:
sss status --sync
| Flag | Description |
|---|---|
| `--region` | Override AWS region |
| `--profile` | AWS CLI profile name |
| `--sync` | Refresh state from AWS before showing status |
Tear Down: sss destroy
Remove all deployed AWS resources.
sss destroy
Use --name to discover and destroy resources by project name when the state file is missing:
sss destroy --name my-app --region us-east-1
| Flag | Description |
|---|---|
| `--region` | Override AWS region |
| `--profile` | AWS CLI profile name |
| `--yes` / `-y` | Skip confirmation prompt |
| `--name` | Project name for discovery (when state file is missing) |
| `--verbose` / `-v` | Print detailed progress |
Note: Lambda@Edge functions cannot be deleted immediately. AWS cleans up edge replicas asynchronously after the CloudFront distribution is removed, which can take 30–60 minutes. If replicas still exist, `sss destroy` will report that the function remains; you can safely re-run the command later to finish cleanup.
Development
To use the MCP server from a local checkout, point to the source directory:
```json
{
  "mcpServers": {
    "three-stars": {
      "command": "uv",
      "args": ["--directory", "/path/to/three-stars", "run", "three-stars-mcp"]
    }
  }
}
```
# Install in development mode
uv sync
# Run tests
uv run pytest
# Lint
uv run ruff check three_stars/ tests/
# Format
uv run ruff format three_stars/ tests/
License