🚀 SkyBot


Create resources on the cloud with natural language, using AI-powered Terraform generation

📖 Features

  • Natural language-based resource creation
  • Support for AWS cloud resources (S3 buckets, EC2 instances, etc.)
  • Local infrastructure development using LocalStack
  • Component-based infrastructure management
  • Interactive chat interface for cloud resources
  • Support for multiple infrastructure components
  • Self-healing infrastructure creation with automatic error fixing

🛠️ Prerequisites

  • Python 3.10 or higher
  • Required packages (to be installed via pip):
    pip install skybot
    
  • Terraform installed:
    brew install terraform
    
  • AWS CLI configured:
    aws configure
    

Note: make sure to configure a default region as well.

  • OpenAI API key:
    export OPENAI_API_KEY='your_api_key_here'
    
  • For local development:
    docker pull localstack/localstack
    docker run -d -p 4566:4566 localstack/localstack
    

📚 Command Structure

Initialize a new project:

skybot init [--verbose] [--local]

Create a new component:

skybot component create --prompt "Your infrastructure description" --name component-name [--verbose] [--force] [--model MODEL_NAME] [--self-healing] [--max-attempts N] [--keep-on-failure]

Delete all components:

skybot component delete [--force]

Destroy the infrastructure of all components:

skybot component destroy [--force]

Edit a component:

skybot component edit component-name

Chat about your infrastructure:

skybot chat component-name

Check SkyBot version:

skybot version

📊 Usage Examples

  1. Initialize a new project:
     skybot init
  2. Create a web server component with self-healing:
     skybot component create --prompt "Create an EC2 instance with nginx installed" --name web-server --self-healing
  3. Create a local S3 bucket for testing:
     skybot component create --prompt "Create an S3 bucket" --name test-bucket --local
  4. Create a database component with custom retry attempts:
     skybot component create --prompt "Set up an RDS instance for PostgreSQL" --name database --self-healing --max-attempts 5
  5. Chat about your infrastructure:
     skybot chat web-server

🗂️ Project Structure

When you initialize a project, SkyBot creates a .skybot directory with the following structure:

.skybot/
└── default/
    ├── backend.tf
    ├── provider.tf
    ├── component1.tf
    ├── component2.tf
    └── ...

Each component is stored as a separate Terraform file in the workspace directory.
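As a rough illustration of this layout (not SkyBot's actual code), a component name could map to its Terraform file with a small path helper; `component_path` and the `workspace` parameter here are hypothetical names:

```python
from pathlib import Path


def component_path(name: str, workspace: str = "default") -> Path:
    """Return the Terraform file path for a component in the .skybot tree."""
    return Path(".skybot") / workspace / f"{name}.tf"


print(component_path("web-server"))  # .skybot/default/web-server.tf
```

Keeping one `.tf` file per component means a component can be edited or destroyed without touching the files of its siblings.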

🔧 Advanced Features

Self-Healing Infrastructure Creation

SkyBot includes a self-healing feature that automatically fixes Terraform errors during resource creation:

  • Enable with --self-healing flag
  • Set maximum retry attempts with --max-attempts N (default: 3)
  • Uses AI to analyze errors and fix configuration issues
  • Maintains original infrastructure intent while resolving dependencies
  • Shows detailed fix explanations for transparency
  • Use --keep-on-failure to preserve generated Terraform files even when errors occur (useful for debugging)

Example with self-healing:

skybot component create \
  --prompt "Create a highly available EC2 setup with auto-scaling" \
  --name ha-web \
  --self-healing \
  --max-attempts 5 \
  --keep-on-failure

If Terraform encounters errors during plan or apply:

  1. SkyBot analyzes the error output
  2. AI suggests fixes while preserving the original intent
  3. Retries the operation with fixed configuration
  4. Continues until success or max attempts reached
  5. If --keep-on-failure is set, preserves the generated Terraform files for inspection even if errors occur
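The retry cycle above can be sketched as a small loop. This is an illustrative sketch only, assuming hypothetical `apply_terraform` and `ai_fix` callables; it is not SkyBot's internal API:

```python
def self_heal(config: str, apply_terraform, ai_fix, max_attempts: int = 3):
    """Retry a Terraform run, asking the AI to fix the config after each error."""
    error = None
    for attempt in range(1, max_attempts + 1):
        ok, error = apply_terraform(config)  # -> (success, error_output)
        if ok:
            return config, attempt
        if attempt < max_attempts:
            # Feed the error output back to the model for a corrected config,
            # preserving the original infrastructure intent.
            config = ai_fix(config, error)
    raise RuntimeError(f"still failing after {max_attempts} attempts: {error}")
```

The key design point is that the original prompt's intent is carried through every `ai_fix` call, so the model repairs the configuration rather than regenerating it from scratch.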

Langfuse Monitoring

SkyBot supports observability and monitoring of AI interactions through Langfuse:

  • Set up Langfuse credentials:

    export LANGFUSE_PUBLIC_KEY='your_public_key'
    export LANGFUSE_SECRET_KEY='your_secret_key'
    
  • All AI interactions are automatically logged to your Langfuse dashboard

Alternative Models

SkyBot supports multiple AI models for infrastructure generation through LiteLLM integration. While OpenAI is the default provider, you can use other models by setting the appropriate API key and specifying the model:

Note: Even when using alternative models, the OPENAI_API_KEY environment variable is still required for certain auxiliary tasks within SkyBot.

Using Groq Models

export GROQ_API_KEY='your_api_key'
skybot component create \
  --name eks-cluster-1 \
  --prompt "create an EKS cluster named MyKubernetesCluster" \
  --self-healing \
  --model "groq/deepseek-r1-distill-llama-70b"

Using Perplexity Models

export PERPLEXITY_API_KEY='your_api_key'
skybot component create \
  --name eks-cluster-1 \
  --prompt "create an EKS cluster named MyKubernetesCluster" \
  --self-healing \
  --model "perplexity/sonar"

The --model flag allows you to specify which model to use for infrastructure generation. Make sure to set the corresponding API key as an environment variable before running the command.

SkyBot supports all models available through LiteLLM (see LiteLLM Documentation), including but not limited to:

  • OpenAI (default), for instance: gpt-4o, o3-mini
  • Groq, for instance: groq/deepseek-r1-distill-llama-70b
  • Perplexity, for instance: perplexity/sonar-pro
  • Anthropic, for instance: anthropic/claude-3-5-sonnet
  • Google VertexAI
  • AWS Bedrock
  • Azure OpenAI
  • Hugging Face
  • And many more

Each provider requires its own API key to be set as an environment variable. Common examples:

  • OPENAI_API_KEY for OpenAI models (required for all setups)
  • GROQ_API_KEY for Groq models
  • PERPLEXITY_API_KEY for Perplexity models
  • ANTHROPIC_API_KEY for Anthropic models
  • AZURE_API_KEY for Azure OpenAI models

Refer to the LiteLLM documentation for the complete list of supported models and their corresponding environment variables.
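As a sketch of how the `--model` value relates to the required environment variable, the LiteLLM provider prefix (the part before the `/`) selects the provider, and prefix-less names fall back to OpenAI. The lookup table below is illustrative, not SkyBot's actual code, and covers only the providers named above:

```python
import os

# Hypothetical mapping from LiteLLM model prefixes to provider API-key
# environment variables; extend for other providers as needed.
PROVIDER_KEYS = {
    "groq": "GROQ_API_KEY",
    "perplexity": "PERPLEXITY_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "azure": "AZURE_API_KEY",
}


def required_key(model: str) -> str:
    """Return the env var a given --model value needs (OpenAI is the default)."""
    prefix = model.split("/", 1)[0] if "/" in model else "openai"
    return PROVIDER_KEYS.get(prefix, "OPENAI_API_KEY")


def key_is_set(model: str) -> bool:
    return required_key(model) in os.environ
```

For example, `required_key("groq/deepseek-r1-distill-llama-70b")` resolves to `GROQ_API_KEY`, while a bare `gpt-4o` resolves to `OPENAI_API_KEY`.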

Download files

Source Distribution

skyforge-0.1.0.tar.gz (19.4 kB)

Built Distribution

skyforge-0.1.0-py3-none-any.whl (22.7 kB)

File details

Details for the file skyforge-0.1.0.tar.gz.

File metadata

  • Size: 19.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.15

File hashes

Algorithm    Hash digest
SHA256       19d4e77d0a2f5913b6f8bb53ea523434b24381b91b0b9c345489115883edea9e
MD5          a020f2ffaafc7dcc8008cc72ae6b8b98
BLAKE2b-256  98acff51359227e100c9189ce09061f3ca3d14d8be5e88071d756d99241f1078

File details

Details for the file skyforge-0.1.0-py3-none-any.whl.

File metadata

  • Size: 22.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.15

File hashes

Algorithm    Hash digest
SHA256       2b8f8a695e7688e2e9faa70dd8bb02e12bcd06124bca79b90b97399aae1504b6
MD5          9ea9e59af8e195da674890b01cf60b54
BLAKE2b-256  ad98c60d7a3b07fe5da6fa7eed4b2629994918a1acd0faad15cb73a494df2857
