Buzzerboy Architecture for Deploying AI-Enhanced Web Applications on AWS Lightsail with Bedrock
BBAWSLightsailMiniAIV1a
AI-enhanced AWS Lightsail Container Service deployment with AWS Bedrock integration.
Overview
BBAWSLightsailMiniAIV1a extends the base BBAWSLightsailMiniV1a stack to add AI capabilities through AWS Bedrock services. It provides infrastructure for AI-powered document processing, embeddings, and chat applications.
Features
- AI-Ready Infrastructure: AWS Bedrock integration for LLM and embedding models
- Document Processing: S3 storage with AI-optimized folder structure
- Training Data Management: Support for URL-based content ingestion
- Containerized Deployment: Lightsail Container Service with AI workloads
- Scalable Storage: S3 buckets for documents, embeddings, and processed content
Installation
```shell
pip install BBAWSLightsailMiniAIV1a
```
Quick Start
```python
from cdktf import App

from BBAWSLightsailMiniAIV1a import BBAWSLightsailMiniAIV1a

app = App()

# Get archetype for configuration
archetype = BBAWSLightsailMiniAIV1a.get_archetype(
    product='ai-app',
    app='smart-docs',
    tier='development',
    organization='your-org',
    region='us-east-1'
)

# Create AI-enhanced stack
ai_stack = BBAWSLightsailMiniAIV1a(
    app, "my-ai-stack",
    # Base configuration
    project_name=archetype.get_project_name(),
    environment=archetype.get_tier(),
    region=archetype.get_region(),
    secret_name=archetype.get_secret_name(),
    profile="default",
    # AI configuration
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    training_urls=[
        "https://docs.aws.amazon.com/bedrock/",
        "https://your-docs.com"
    ]
)

archetype.set_stack(ai_stack)
app.synth()
```
Configuration
AI Models
The stack supports various AWS Bedrock models:
- Claude Models: anthropic.claude-3-sonnet-20240229-v1:0, anthropic.claude-3-haiku-20240307-v1:0
- Titan Models: amazon.titan-text-express-v1, amazon.titan-embed-text-v1
- Llama Models: meta.llama2-13b-chat-v1, meta.llama2-70b-chat-v1
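As a minimal sketch of how a container application might call one of the Claude models above, the snippet below builds a Bedrock Messages API request body. The helper name `build_claude_request` and the default `max_tokens` are illustrative, not part of this package; sending the request requires valid AWS credentials, so the boto3 call is shown only in comments.

```python
import json

def build_claude_request(prompt: str, max_tokens: int = 512) -> str:
    """Build a Bedrock Messages API request body for a Claude model."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

# With AWS credentials configured, the body can be sent via boto3:
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.invoke_model(
#     modelId="anthropic.claude-3-sonnet-20240229-v1:0",
#     body=build_claude_request("Summarize this document."),
# )
```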
Training Data
Configure training URLs for document ingestion:
```python
training_urls = [
    "https://docs.aws.amazon.com/bedrock/",
    "https://your-company.com/api-docs",
    "https://your-company.com/knowledge-base"
]
```
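Ingested content from these URLs lands under the stack's training/ prefix (see the S3 layout in the Architecture section). The key scheme the stack actually uses is not documented, so the helper below is only an illustration of mapping a training URL to a deterministic S3 object key.

```python
from urllib.parse import urlparse

def training_key_for_url(url: str) -> str:
    """Map a training URL to a key under the training/ prefix (hypothetical scheme)."""
    parsed = urlparse(url)
    # Flatten the URL path into a single filename segment
    path = parsed.path.strip("/").replace("/", "_") or "index"
    return f"training/{parsed.netloc}/{path}.html"
```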
Chunking Configuration
Customize document processing:
```python
chunk_size = 2000    # Characters per chunk
chunk_overlap = 400  # Character overlap between chunks
```
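The package's own chunking implementation is internal, but the parameters above behave like a standard sliding window. A minimal sketch, assuming character-based chunks with fixed overlap:

```python
def chunk_text(text: str, chunk_size: int = 2000, chunk_overlap: int = 400) -> list:
    """Split text into overlapping character chunks."""
    if chunk_overlap >= chunk_size:
        raise ValueError("chunk_overlap must be smaller than chunk_size")
    step = chunk_size - chunk_overlap  # how far each window advances
    stop = max(len(text) - chunk_overlap, 1)
    return [text[i:i + chunk_size] for i in range(0, stop, step)]
```

With the defaults, consecutive chunks share their last/first 400 characters, so a sentence cut by one chunk boundary is still intact in the neighboring chunk.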
Architecture
The AI stack creates:
- S3 Buckets:
  - documents/ - Raw document storage
  - embeddings/ - Processed embeddings
  - training/ - Training data from URLs
- IAM Roles:
  - Bedrock access permissions
  - S3 read/write permissions
  - Container service access
- Lightsail Container Service:
  - AI application deployment
  - Auto-scaling configuration
  - Custom domain support
Examples
Basic Deployment
# See examples/basic_deployment.py
Advanced Configuration
# See examples/advanced_deployment.py
Environment Variables
The following environment variables are available in your container:
- AWS_BEDROCK_MODEL: The configured Bedrock model
- AWS_BEDROCK_EMBEDDING_MODEL: The embedding model
- S3_DOCUMENTS_BUCKET: Documents bucket name
- S3_EMBEDDINGS_BUCKET: Embeddings bucket name
- CHUNK_SIZE: Document chunk size
- CHUNK_OVERLAP: Chunk overlap size
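Application code in the container can read these variables at startup. A sketch of a small config loader; the fallback values are assumptions for illustration (they mirror the defaults used elsewhere in this README), not documented defaults of the stack:

```python
import os
from typing import Mapping, Optional

def load_ai_config(env: Optional[Mapping] = None) -> dict:
    """Read the stack's container environment variables, with fallbacks."""
    env = os.environ if env is None else env
    return {
        "model": env.get("AWS_BEDROCK_MODEL", "anthropic.claude-3-sonnet-20240229-v1:0"),
        "embedding_model": env.get("AWS_BEDROCK_EMBEDDING_MODEL", "amazon.titan-embed-text-v1"),
        "documents_bucket": env.get("S3_DOCUMENTS_BUCKET", ""),
        "embeddings_bucket": env.get("S3_EMBEDDINGS_BUCKET", ""),
        "chunk_size": int(env.get("CHUNK_SIZE", "2000")),
        "chunk_overlap": int(env.get("CHUNK_OVERLAP", "400")),
    }
```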
Development
Building the Package
```shell
python prepare.py
python setup.py build
```
Running Tests
```shell
pytest tests/
```
Semantic Versioning
This package uses semantic-release for automated versioning:
```shell
semantic-release publish
```
Architecture Flags
Use architecture flags to customize deployment:
```python
ArchitectureFlags = BBAWSLightsailMiniAIV1a.get_architecture_flags()

flags = [
    ArchitectureFlags.SKIP_DATABASE._value_,  # Skip database for AI-only workloads
    # Add other flags as needed
]
```
Dependencies
- BBAWSLightsailMiniV1a>=1.0.0 - Base Lightsail stack
- cdktf>=0.20.0 - CDK for Terraform
- constructs>=10.0.0 - AWS CDK constructs
License
MIT License - see LICENSE file for details.
Support
For issues and questions:
- GitHub Issues: Repository issues
- Documentation: Full documentation
Changelog
See CHANGELOG.md for version history and changes.