
The Customizable AI Dockerfile Generation Framework


DockAI

AI-Powered Dockerfile Generation Framework

Generate production-ready Dockerfiles from first principles using AI agents


Quick Start · Features · Docs · CI/CD · Contributing


🎯 What is DockAI?

DockAI is an agentic AI framework that analyzes your codebase and generates optimized, production-ready Dockerfiles. Unlike template-based tools, DockAI uses first-principles reasoning to understand your application and create Dockerfiles from scratch—handling everything from standard stacks to legacy systems.

pip install dockai-cli
dockai build /path/to/project

That's it. DockAI handles the rest.


✨ Features

🧠 First-Principles AI

No templates. Analyzes file structures, dependencies, and code patterns to deduce the optimal containerization strategy.

🔄 Self-Correcting Workflow

Builds and tests Dockerfiles in a sandbox. If something fails, AI reflects, learns, and retries with a new approach.

🛡️ Security-First

Built-in Trivy integration scans for vulnerabilities. Enforces non-root users, minimal base images, and hardened configs.

🤖 10 Specialized Agents

Each agent handles a specific task: analysis, planning, generation, review, and more. All fully customizable.

⚡ Multi-Provider LLMs

Supports OpenAI, Azure, Gemini, Anthropic, and Ollama. Mix and match providers per agent (e.g., OpenAI for analysis, Ollama for generation).

🔧 Fully Customizable

Override prompts, instructions, and model selection per agent. Use .dockai files for repo-specific configs.

📦 Smart Registry Integration

Automatically validates base images against Docker Hub, GCR, Quay, and GHCR. Prioritizes small, secure variants like alpine and slim.

🚀 Performance Optimized

Intelligent caching prevents redundant network calls. Semantic version sorting ensures you always get the latest stable releases.
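The tag-selection behavior described above can be sketched in outline. The snippet below is a hypothetical illustration, not DockAI's actual implementation (`parse_semver` and `pick_base_tag` are invented names): it parses semantic versions, filters obvious pre-release tags, and breaks ties between equal versions in favor of small variants such as -alpine and -slim.

```python
import re

def parse_semver(tag):
    """Extract (major, minor, patch) from a tag like '3.12.1-alpine'; None if unversioned."""
    m = re.match(r"^(\d+)\.(\d+)(?:\.(\d+))?", tag)
    if not m:
        return None
    return (int(m.group(1)), int(m.group(2)), int(m.group(3) or 0))

def pick_base_tag(tags, preferred_variants=("alpine", "slim")):
    """Choose the newest stable tag; among equal versions, prefer small variants."""
    def score(tag):
        ver = parse_semver(tag)
        variant_rank = next(
            (len(preferred_variants) - i
             for i, v in enumerate(preferred_variants) if v in tag),
            0,
        )
        # Versioned tags beat unversioned ones; newer versions beat older;
        # preferred variants break ties between equal versions.
        return (ver is not None, ver or (0, 0, 0), variant_rank)

    # Drop obvious pre-release tags before sorting.
    stable = [t for t in tags if "rc" not in t and "beta" not in t]
    return max(stable, key=score, default=None)

tags = ["3.12.1", "3.12.1-alpine", "3.11.9-slim", "3.13.0rc1", "latest"]
print(pick_base_tag(tags))  # → 3.12.1-alpine
```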


🚀 Three Ways to Use DockAI

DockAI is designed to fit into any workflow, whether you are a developer working locally, a DevOps engineer automating pipelines, or an AI agent driving tools over MCP.

1. The CLI (For Developers)

Perfect for running locally on your machine.

# Install
pip install dockai-cli

# Run
dockai build .

2. GitHub Actions (For CI/CD)

Automate Dockerfile generation in your pipelines.

steps:
  - uses: actions/checkout@v4
  - uses: itzzjb/dockai@v3
    with:
      openai_api_key: ${{ secrets.OPENAI_API_KEY }}

3. MCP Server (For AI Agents)

Use DockAI directly inside Claude Desktop, Cursor, or any MCP-compliant tool.

  1. Install dockai-cli.
  2. Configure your MCP client:
{
  "mcpServers": {
    "dockai": {
      "command": "python",
      "args": ["-m", "dockai.core.mcp_server"]
    }
  }
}
  3. Ask your AI: "Analyze this project and generate a Dockerfile for it."

Configuration

Create a .env file:

# Required: Choose your LLM provider and add the API key
OPENAI_API_KEY=sk-your-api-key

# Optional: Use a different provider (openai, azure, gemini, anthropic, ollama)
# DOCKAI_LLM_PROVIDER=openai

Usage

# Generate Dockerfile for your project
dockai build /path/to/project

# With verbose output
dockai build /path/to/project --verbose

🏗️ How It Works

graph TD
    Start([▶ Start]) --> Scan[📂 Scanner]
    Scan --> Analyze[🧠 Analyzer]
    Analyze --> Read[📖 Reader]
    Read --> Health[🏥 Health Detector]
    Health --> Ready[⏱️ Readiness Detector]
    Ready --> Plan[📝 Planner]
    Plan --> Generate[⚙️ Generator]
    Generate --> Review[🔒 Security Reviewer]
    
    Review -- Secure --> Validate[✅ Validator]
    Review -- Issues & Can Retry --> Reflect[🤔 Reflector]
    Review -- Critical & Max Retries --> Fail([❌ Fail])
    
    Validate -- Success --> End([🏁 Finish])
    Validate -- Failure --> Reflect
    
    Reflect --> Increment[🔄 Increment Retry]
    
    Increment -- Fix Code --> Generate
    Increment -- New Strategy --> Plan
    Increment -- Re-Analyze --> Analyze
    Increment -- Max Retries --> Fail
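In code, the retry loop roughly amounts to the following. This is a simplified, hypothetical sketch (the function names are invented, not DockAI's internal API), and it only shows the retry-generation path; the real graph can also route back to the Planner or Analyzer.

```python
def run_workflow(generate, review, validate, reflect, max_retries=3):
    """Sketch of the self-correcting loop: generate, review, validate, reflect, retry.
    max_retries mirrors the MAX_RETRIES setting."""
    feedback = None
    for attempt in range(max_retries):
        dockerfile = generate(feedback)      # Generator (uses Reflector feedback on retries)
        if not review(dockerfile):           # Security Reviewer finds issues
            feedback = reflect(dockerfile)   # Reflector proposes a new approach
            continue
        if validate(dockerfile):             # Validator builds and tests in a sandbox
            return dockerfile
        feedback = reflect(dockerfile)       # Build/test failure: reflect and retry
    raise RuntimeError("Max retries reached without a passing Dockerfile")
```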

🤖 The 10 AI Agents

Agent | Role | Model Type
Analyzer | Project discovery & stack detection | Fast
Planner | Strategic build planning | Fast
Generator | Dockerfile creation | Powerful
Generator (Iterative) | Debugging failed Dockerfiles | Powerful
Reviewer | Security audit & hardening | Fast
Reflector | Failure analysis & learning | Powerful
Health Detector | Health endpoint discovery | Fast
Readiness Detector | Startup pattern analysis | Fast
Error Analyzer | Error classification | Fast
Iterative Improver | Targeted fix application | Powerful

⚙️ Configuration

Environment Variables

Variable | Description | Default
OPENAI_API_KEY | OpenAI API key | Required*
GOOGLE_API_KEY | Google Gemini API key | Required*
ANTHROPIC_API_KEY | Anthropic Claude API key | Required*
AZURE_OPENAI_API_KEY | Azure OpenAI API key | Required*
OLLAMA_BASE_URL | Ollama base URL | http://localhost:11434
DOCKAI_LLM_PROVIDER | Provider (openai, azure, gemini, anthropic, ollama) | openai
MAX_RETRIES | Maximum retry attempts | 3
DOCKAI_SKIP_SECURITY_SCAN | Skip Trivy scanning | false
DOCKAI_TRUNCATION_ENABLED | Enable file truncation | false
DOCKAI_TOKEN_LIMIT | Token limit for auto-truncation | 100000
DOCKAI_MAX_FILE_CHARS | Max chars per file (when truncating) | 200000
DOCKAI_MAX_FILE_LINES | Max lines per file (when truncating) | 5000

*Only one API key is required: the one matching your chosen provider.
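When truncation is enabled, the caps above could be applied along these lines. A minimal sketch, assuming a line cap followed by a character cap; `truncate_file` is an invented name, not DockAI's API.

```python
def truncate_file(text, max_chars=200_000, max_lines=5_000):
    """Apply the DOCKAI_MAX_FILE_LINES cap first, then the DOCKAI_MAX_FILE_CHARS cap.
    Defaults mirror the documented environment-variable defaults."""
    lines = text.splitlines()[:max_lines]
    return "\n".join(lines)[:max_chars]
```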

Repository-Level Configuration

Create a .dockai file in your project root:

[instructions_analyzer]
This is a Django application with Celery workers.

[instructions_generator]
Use gunicorn as the WSGI server.
Run database migrations at container start.

[instructions_reviewer]
All containers must run as non-root (UID >= 10000).
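The format above is INI-like section headers followed by free-form instruction text (so a strict key=value parser such as Python's configparser would not apply directly). A hypothetical parser sketch, with `parse_dockai` as an invented name:

```python
def parse_dockai(text):
    """Collect free-form instruction lines under each [section] header."""
    sections, current = {}, None
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("[") and stripped.endswith("]"):
            current = stripped[1:-1]       # start a new section
            sections[current] = []
        elif current is not None and stripped:
            sections[current].append(stripped)
    return {name: "\n".join(lines) for name, lines in sections.items()}

config = parse_dockai("""\
[instructions_generator]
Use gunicorn as the WSGI server.
Run database migrations at container start.
""")
print(config["instructions_generator"])
```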

🔗 GitHub Actions

name: Auto-Dockerize

on:
  push:
    branches: [main]

jobs:
  dockai:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: itzzjb/dockai@v3
        with:
          openai_api_key: ${{ secrets.OPENAI_API_KEY }}

💡 Tip: By default, the Dockerfile is generated at runtime and not committed. If you want to save it to your repository, see the Committing Generated Dockerfile guide.

Multi-Provider Example

- uses: itzzjb/dockai@v3
  with:
    llm_provider: gemini
    google_api_key: ${{ secrets.GOOGLE_API_KEY }}
    max_retries: 5
    strict_security: true

Mixed Provider Example

Use Ollama locally for most tasks, but OpenAI for complex analysis:

# .env
DOCKAI_LLM_PROVIDER=ollama
DOCKAI_MODEL_ANALYZER=openai/gpt-4o-mini

See GitHub Actions Guide for all options.


📖 Documentation

Document | Description
Getting Started | Installation, configuration, first run
Architecture | Deep dive into the internal design
Configuration | Full reference for env vars and inputs
Customization | Tuning agents for your organization
API Reference | Module and function documentation
GitHub Actions | CI/CD integration guide
MCP Server | AI agent integration guide
FAQ | Frequently asked questions

💡 MCP Support: Expose DockAI as a Model Context Protocol server for use in any MCP client.


🛠️ Tech Stack

Technology | Purpose
Python 3.10+ | Core runtime
LangGraph | Stateful agent workflow orchestration
LangChain | LLM provider integration
Pydantic | Structured output validation
Rich + Typer | Beautiful CLI interface
Trivy | Security vulnerability scanning

🤝 Contributing

Contributions are welcome! Feel free to open issues and pull requests.


📄 License

MIT License - see LICENSE for details.


Built with ❤️ by itzzjb

Project details

dockai_cli 3.0.2 is published to PyPI as a source distribution, dockai_cli-3.0.2.tar.gz (98.2 kB), and a built distribution, dockai_cli-3.0.2-py3-none-any.whl (84.9 kB), uploaded with twine 6.1.0 on CPython 3.13.7.
