🚀 InfraBot
Create resources in the cloud with natural language using AI-powered Terraform generation
📖 Features
- Natural language-based resource creation
- Support for AWS cloud resources (S3 buckets, EC2 instances, etc.)
- Local infrastructure development using LocalStack
- Component-based infrastructure management
- Interactive chat interface for cloud resources
- Support for multiple infrastructure components
- Self-healing infrastructure creation with automatic error fixing
🛠️ Prerequisites
- Python 3.10 or higher
- Required packages (to be installed via pip):
pip install infrabot
- Terraform installed:
brew install terraform
- AWS CLI configured:
aws configure
Make sure to configure the default region as well.
- OpenAI API key:
export OPENAI_API_KEY='your_api_key_here'
- For local development:
docker pull localstack/localstack
docker run -d -p 4566:4566 localstack/localstack
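As a quick sanity check, the prerequisites above can be verified from a shell before running InfraBot (a minimal sketch; the tool list simply mirrors the bullets above):

```shell
# Preflight check for the prerequisites listed above (illustrative sketch).
for tool in python3 terraform aws docker; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
# The OpenAI key must be exported so infrabot can reach the API.
if [ -n "$OPENAI_API_KEY" ]; then
  echo "OPENAI_API_KEY: set"
else
  echo "OPENAI_API_KEY: MISSING"
fi
```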
📚 Command Structure
Initialize a new project:
infrabot init [--verbose] [--local]
Create a new component:
infrabot component create --prompt "Your infrastructure description" --name component-name [--verbose] [--force] [--model MODEL_NAME] [--self-healing] [--max-attempts N] [--keep-on-failure]
Delete all components:
infrabot component delete [--force]
Destroy all components' infrastructure:
infrabot component destroy [--force]
Edit a component:
infrabot component edit component-name
Chat about your infrastructure:
infrabot chat component-name
Check InfraBot version:
infrabot version
📊 Usage Examples
- Initialize a new project:
infrabot init
- Create a web server component with self-healing:
infrabot component create --prompt "Create an EC2 instance with nginx installed" --name web-server --self-healing
- Create a local S3 bucket for testing:
infrabot component create --prompt "Create an S3 bucket" --name test-bucket --local
- Create a database component with custom retry attempts:
infrabot component create --prompt "Set up an RDS instance for PostgreSQL" --name database --self-healing --max-attempts 5
- Chat about your infrastructure:
infrabot chat web-server
🗂️ Project Structure
When you initialize a project, InfraBot creates a .infrabot directory with the following structure:
.infrabot/
└── default/
├── backend.tf
├── provider.tf
├── component1.tf
├── component2.tf
└── ...
Each component is stored as a separate Terraform file in the workspace directory.
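For illustration, the layout above can be reproduced and inspected like this (the component file names, such as web-server.tf, are examples rather than fixed names):

```shell
# Recreate the workspace layout shown above in a throwaway directory.
ws=$(mktemp -d)
mkdir -p "$ws/.infrabot/default"
touch "$ws/.infrabot/default/backend.tf" \
      "$ws/.infrabot/default/provider.tf" \
      "$ws/.infrabot/default/web-server.tf"   # one .tf file per component
ls "$ws/.infrabot/default"
```

Because each component is plain Terraform, standard tooling works against the workspace too, e.g. `terraform -chdir=.infrabot/default plan` if Terraform is installed.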
🔧 Advanced Features
Self-Healing Infrastructure Creation
InfraBot includes a self-healing feature that automatically fixes Terraform errors during resource creation:
- Enable with the `--self-healing` flag
- Set maximum retry attempts with `--max-attempts N` (default: 3)
- Uses AI to analyze errors and fix configuration issues
- Maintains original infrastructure intent while resolving dependencies
- Shows detailed fix explanations for transparency
- Use `--keep-on-failure` to preserve generated Terraform files even when errors occur (useful for debugging)
Example with self-healing:
infrabot component create \
--prompt "Create a highly available EC2 setup with auto-scaling" \
--name ha-web \
--self-healing \
--max-attempts 5 \
--keep-on-failure
If Terraform encounters errors during plan or apply:
- InfraBot analyzes the error output
- AI suggests fixes while preserving the original intent
- Retries the operation with fixed configuration
- Continues until success or max attempts reached
- If `--keep-on-failure` is set, preserves the generated Terraform files for inspection even if errors occur
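The retry flow can be sketched as a plain shell loop (purely illustrative, not InfraBot's actual code; `fake_apply` stands in for the terraform plan/apply step plus the AI fix, and here it fails twice before succeeding):

```shell
# Minimal sketch of the retry semantics behind --self-healing.
state=$(mktemp)
echo 0 > "$state"
fake_apply() {
  n=$(cat "$state")
  echo $((n + 1)) > "$state"        # count how many times we were called
  [ "$n" -ge 2 ]                    # succeed on the third call
}

max_attempts=3
attempt=1
while ! fake_apply; do
  if [ "$attempt" -ge "$max_attempts" ]; then
    echo "giving up after $max_attempts attempts"
    break
  fi
  echo "attempt $attempt failed; applying AI-suggested fix and retrying"
  attempt=$((attempt + 1))
done
echo "finished after $attempt attempts"
```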
Langfuse Monitoring
InfraBot supports observability and monitoring of AI interactions through Langfuse:
- Set up Langfuse credentials:
export LANGFUSE_PUBLIC_KEY='your_public_key'
export LANGFUSE_SECRET_KEY='your_secret_key'
- All AI interactions are automatically logged to your Langfuse dashboard
Alternative Models
InfraBot supports multiple AI models for infrastructure generation through LiteLLM integration. While OpenAI is the default provider, you can use other models by setting the appropriate API key and specifying the model:
Note: Even when using alternative models, the OPENAI_API_KEY environment variable is still required for certain auxiliary tasks within InfraBot.
Using Groq Models
export GROQ_API_KEY='your_api_key'
infrabot component create \
--name eks-cluster-1 \
--prompt "create an EKS cluster named MyKubernetesCluster" \
--self-healing \
--model "groq/deepseek-r1-distill-llama-70b"
Using Perplexity Models
export PERPLEXITY_API_KEY='your_api_key'
infrabot component create \
--name eks-cluster-1 \
--prompt "create an EKS cluster named MyKubernetesCluster" \
--self-healing \
--model "perplexity/sonar-pro"
Recommendation: We recommend using the `perplexity/sonar-pro` model for its enhanced factuality and accuracy in infrastructure generation.
The --model flag allows you to specify which model to use for infrastructure generation. Make sure to set the corresponding API key as an environment variable before running the command.
InfraBot supports all models available through LiteLLM (see LiteLLM Documentation), including but not limited to:
- OpenAI (default), for instance: `gpt-4o`, `o3-mini`
- Groq, for instance: `groq/deepseek-r1-distill-llama-70b`
- Perplexity, for instance: `perplexity/sonar-pro`
- Anthropic, for instance: `anthropic/claude-3-5-sonnet`
- Google VertexAI
- AWS Bedrock
- Azure OpenAI
- Hugging Face
- And many more
Each provider requires its own API key to be set as an environment variable. Common examples:
- `OPENAI_API_KEY` for OpenAI models (required for all setups)
- `GROQ_API_KEY` for Groq models
- `PERPLEXITY_API_KEY` for Perplexity models
- `ANTHROPIC_API_KEY` for Anthropic models
- `AZURE_API_KEY` for Azure OpenAI models
Refer to the LiteLLM documentation for the complete list of supported models and their corresponding environment variables.
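The model-to-key mapping above can be expressed as a small helper (purely illustrative; `key_for_model` is not part of InfraBot, and the prefixes follow LiteLLM's provider/model naming):

```shell
# Hypothetical helper: which API key variable a given --model value requires.
key_for_model() {
  case "$1" in
    groq/*)       echo GROQ_API_KEY ;;
    perplexity/*) echo PERPLEXITY_API_KEY ;;
    anthropic/*)  echo ANTHROPIC_API_KEY ;;
    azure/*)      echo AZURE_API_KEY ;;
    *)            echo OPENAI_API_KEY ;;   # unprefixed models default to OpenAI
  esac
}

key_for_model "perplexity/sonar-pro"
key_for_model "gpt-4o"
```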
File details
Details for the file infrabot-0.1.1.tar.gz.
File metadata
- Download URL: infrabot-0.1.1.tar.gz
- Upload date:
- Size: 19.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.15
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `27e5b653d177bf75321a1e562f7b12a7c6eeb5590a2c73f825e6bf0872bff77c` |
| MD5 | `90a4d5234a4031edb83181c252b5193b` |
| BLAKE2b-256 | `0adaddad04d3ccd36aab7329c19b7a7863185d65c60bf3e56e35a8d213ec34b2` |
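To check a downloaded artifact against a published digest, a pattern like the following works (shown on a stand-in file; substitute infrabot-0.1.1.tar.gz and the SHA256 value above):

```shell
# Verify a file against an expected SHA256 digest (stand-in file for illustration).
f=$(mktemp)
printf 'example payload' > "$f"
expected=$(sha256sum "$f" | awk '{print $1}')   # in practice, the published digest
echo "$expected  $f" | sha256sum --check --quiet && echo "hash OK"
```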
File details
Details for the file infrabot-0.1.1-py3-none-any.whl.
File metadata
- Download URL: infrabot-0.1.1-py3-none-any.whl
- Upload date:
- Size: 22.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.15
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `cd8cc05d43bbf53c67419c4ab4a8237c2adea50674ab9520b338b9533e979862` |
| MD5 | `0ac649e188b192dfe13acfbfe80f1483` |
| BLAKE2b-256 | `435feaef48defa055ac14fc2e4e9b793e30659fb850c22c93ae288a4c73e6a4c` |