The Customizable AI Dockerfile Generation Framework

Project description

DockAI Logo

DockAI

AI-Powered Dockerfile Generation Framework

Generate production-ready Dockerfiles from first principles using AI agents


Quick Start · Features · Docs · CI/CD · Contributing


🎯 What is DockAI?

DockAI is an agentic AI framework that analyzes your codebase and generates optimized, production-ready Dockerfiles. Unlike template-based tools, DockAI uses first-principles reasoning to understand your application and create Dockerfiles from scratch—handling everything from standard stacks to legacy systems.

pip install dockai-cli
dockai build /path/to/project

That's it. DockAI handles the rest.


✨ Features

🧠 First-Principles AI

No templates. Analyzes file structures, dependencies, and code patterns to deduce the optimal containerization strategy.
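The analysis step can be pictured as marker-file detection; this is a minimal sketch, where the function name detect_stack and the marker table are illustrative assumptions, not DockAI's actual logic:

```python
# Hypothetical sketch: infer a project's stack from marker files in its root.
# The marker table and function name are illustrative, not DockAI's internals.

def detect_stack(filenames):
    """Return a stack label based on well-known marker files."""
    markers = {
        "requirements.txt": "python",
        "pyproject.toml": "python",
        "package.json": "node",
        "go.mod": "go",
        "pom.xml": "java",
    }
    for name, stack in markers.items():
        if name in filenames:
            return stack
    return "unknown"
```

In practice the real analysis goes further than filenames (dependencies, code patterns), which is what distinguishes this from a template lookup.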

🔄 Self-Correcting Workflow

Builds and tests Dockerfiles in a sandbox. If something fails, AI reflects, learns, and retries with a new approach.
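The loop this describes can be sketched as follows; generate, build, and reflect are hypothetical stand-ins for the agents involved, not DockAI's API:

```python
# Hypothetical sketch of the build-test-reflect loop described above.
# The callables are illustrative stand-ins for DockAI's agents.

def generate_with_retries(generate, build, reflect, max_retries=3):
    """Draft a Dockerfile, test it in a sandbox, and retry on failure."""
    feedback = None
    for attempt in range(1, max_retries + 1):
        dockerfile = generate(feedback)        # draft (or redraft) the file
        ok, error = build(dockerfile)          # sandboxed build + smoke test
        if ok:
            return dockerfile
        feedback = reflect(dockerfile, error)  # analyze the failure for next try
    raise RuntimeError(f"Failed after {max_retries} attempts")
```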

🛡️ Security-First

Built-in Trivy integration scans for vulnerabilities. Enforces non-root users, minimal base images, and hardened configs.

🤖 10 Specialized Agents

Each agent handles a specific task: analysis, planning, generation, review, and more. All fully customizable.

⚡ Multi-Provider LLMs

Supports OpenAI, Azure OpenAI, Google Gemini, and Anthropic Claude. Mix models per agent for cost optimization.

🔧 Fully Customizable

Override prompts, instructions, and model selection per agent. Use .dockai files for repo-specific configs.


🚀 Three Ways to Use DockAI

DockAI is designed to fit into any workflow, whether you are a developer, a DevOps engineer, or an AI user.

1. The CLI (For Developers)

Perfect for running locally on your machine.

# Install
pip install dockai-cli

# Run
dockai build .

2. GitHub Actions (For CI/CD)

Automate Dockerfile generation in your pipelines.

steps:
  - uses: actions/checkout@v4
  - uses: itzzjb/dockai@v2
    with:
      openai_api_key: ${{ secrets.OPENAI_API_KEY }}

3. MCP Server (For AI Agents)

Use DockAI directly inside Claude Desktop, Cursor, or any MCP-compliant tool.

  1. Install dockai-cli.
  2. Configure your MCP client:
{
  "mcpServers": {
    "dockai": {
      "command": "python",
      "args": ["-m", "dockai.core.mcp_server"]
    }
  }
}
  3. Ask your AI: "Analyze this project and generate a Dockerfile for it."

Configuration

Create a .env file:

# Required: Choose your LLM provider and add the API key
OPENAI_API_KEY=sk-your-api-key

# Optional: Use a different provider (openai, azure, gemini, anthropic)
# DOCKAI_LLM_PROVIDER=openai

Usage

# Generate Dockerfile for your project
dockai build /path/to/project

# With verbose output
dockai build /path/to/project --verbose

🏗️ How It Works

graph TD
    Start([▶ Start]) --> Scan[📂 Scanner]
    Scan --> Analyze[🧠 Analyzer]
    Analyze --> Read[📖 Reader]
    Read --> Health[🏥 Health Detector]
    Health --> Ready[⏱️ Readiness Detector]
    Ready --> Plan[📝 Planner]
    Plan --> Generate[⚙️ Generator]
    Generate --> Review[🔒 Security Reviewer]
    
    Review -- Secure --> Validate[✅ Validator]
    Review -- Issues & Can Retry --> Reflect[🤔 Reflector]
    Review -- Critical & Max Retries --> Fail([❌ Fail])
    
    Validate -- Success --> End([🏁 Finish])
    Validate -- Failure --> Reflect
    
    Reflect --> Increment[🔄 Increment Retry]
    
    Increment -- Fix Code --> Generate
    Increment -- New Strategy --> Plan
    Increment -- Re-Analyze --> Analyze
    Increment -- Max Retries --> Fail
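The Reflector's routing in the diagram above can be sketched as a plain function; the name route_after_reflection and the error labels are illustrative assumptions, not DockAI's internal API:

```python
# Hypothetical sketch of the routing shown in the graph: after a failure,
# the Reflector decides how far back in the pipeline to jump.

def route_after_reflection(error_kind, retries, max_retries=3):
    """Map a classified failure to the next node in the workflow graph."""
    if retries >= max_retries:
        return "fail"
    if error_kind == "syntax":    # broken Dockerfile -> regenerate only
        return "generate"
    if error_kind == "strategy":  # wrong base image or build plan -> replan
        return "plan"
    return "analyze"              # misunderstood stack -> re-analyze project
```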

🤖 The 10 AI Agents

| Agent | Role | Model Type |
|---|---|---|
| Analyzer | Project discovery & stack detection | Fast |
| Planner | Strategic build planning | Fast |
| Generator | Dockerfile creation | Powerful |
| Generator (Iterative) | Debugging failed Dockerfiles | Powerful |
| Reviewer | Security audit & hardening | Fast |
| Reflector | Failure analysis & learning | Powerful |
| Health Detector | Health endpoint discovery | Fast |
| Readiness Detector | Startup pattern analysis | Fast |
| Error Analyzer | Error classification | Fast |
| Iterative Improver | Targeted fix application | Powerful |

⚙️ Configuration

Environment Variables

| Variable | Description | Default |
|---|---|---|
| OPENAI_API_KEY | OpenAI API key | Required* |
| GOOGLE_API_KEY | Google Gemini API key | Required* |
| ANTHROPIC_API_KEY | Anthropic Claude API key | Required* |
| AZURE_OPENAI_API_KEY | Azure OpenAI API key | Required* |
| DOCKAI_LLM_PROVIDER | Provider (openai, azure, gemini, anthropic) | openai |
| MAX_RETRIES | Maximum retry attempts | 3 |
| DOCKAI_SKIP_SECURITY_SCAN | Skip Trivy scanning | false |
| DOCKAI_MAX_FILE_CHARS | Max chars per file | 200000 |
| DOCKAI_MAX_FILE_LINES | Max lines per file | 5000 |

*Only the API key for your chosen provider is required.
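The provider/key pairing in the table can be sketched in Python; resolve_provider and PROVIDER_KEYS are hypothetical helpers, though the environment variable names match the table above:

```python
# Hypothetical sketch of resolving the active provider and its API key.
# Variable names come from the table; the helper itself is illustrative.
import os

PROVIDER_KEYS = {
    "openai": "OPENAI_API_KEY",
    "azure": "AZURE_OPENAI_API_KEY",
    "gemini": "GOOGLE_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
}

def resolve_provider(env=os.environ):
    """Return (provider, api_key); only the chosen provider's key is needed."""
    provider = env.get("DOCKAI_LLM_PROVIDER", "openai")
    key_name = PROVIDER_KEYS.get(provider)
    if key_name is None:
        raise ValueError(f"Unknown provider: {provider}")
    key = env.get(key_name)
    if not key:
        raise EnvironmentError(f"{key_name} must be set for provider '{provider}'")
    return provider, key
```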

Repository-Level Configuration

Create a .dockai file in your project root:

[instructions_analyzer]
This is a Django application with Celery workers.

[instructions_generator]
Use gunicorn as the WSGI server.
Run database migrations at container start.

[instructions_reviewer]
All containers must run as non-root (UID >= 10000).
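A .dockai file of this shape could be parsed in a few lines; parse_dockai is an illustrative sketch (INI-style section headers followed by free-text instruction lines), not DockAI's actual parser:

```python
# Hypothetical sketch of parsing the .dockai format shown above:
# [section] headers followed by free-text instruction lines.

def parse_dockai(text):
    """Return {section_name: instruction_text} from a .dockai file body."""
    sections, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]          # start a new instruction section
            sections[current] = []
        elif line and current is not None:
            sections[current].append(line)  # accumulate instruction lines
    return {name: "\n".join(lines) for name, lines in sections.items()}
```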

🔗 GitHub Actions

name: Auto-Dockerize

on:
  push:
    branches: [main]

jobs:
  dockai:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: itzzjb/dockai@v2
        with:
          openai_api_key: ${{ secrets.OPENAI_API_KEY }}

Multi-Provider Example

- uses: itzzjb/dockai@v2
  with:
    llm_provider: gemini
    google_api_key: ${{ secrets.GOOGLE_API_KEY }}
    max_retries: 5
    strict_security: true

See GitHub Actions Guide for all options.


📖 Documentation

| Document | Description |
|---|---|
| Getting Started | Installation, configuration, first run |
| Architecture | Deep dive into the internal design |
| Configuration | Full reference for env vars and inputs |
| Customization | Tuning agents for your organization |
| API Reference | Module and function documentation |
| GitHub Actions | CI/CD integration guide |
| MCP Server | AI agent integration guide |
| FAQ | Frequently asked questions |

💡 MCP Support: Expose DockAI as a Model Context Protocol server for use in any MCP client.


🛠️ Tech Stack

| Technology | Purpose |
|---|---|
| Python 3.10+ | Core runtime |
| LangGraph | Stateful agent workflow orchestration |
| LangChain | LLM provider integration |
| Pydantic | Structured output validation |
| Rich + Typer | Beautiful CLI interface |
| Trivy | Security vulnerability scanning |

🤝 Contributing

Contributions are welcome! Feel free to open issues and pull requests.


📄 License

MIT License - see LICENSE for details.


Built with ❤️ by itzzjb


Download files

Download the file for your platform.

Source Distribution

dockai_cli-2.3.0.tar.gz (88.7 kB)

Uploaded Source

Built Distribution


dockai_cli-2.3.0-py3-none-any.whl (81.4 kB)

Uploaded Python 3

File details

Details for the file dockai_cli-2.3.0.tar.gz.

File metadata

  • Download URL: dockai_cli-2.3.0.tar.gz
  • Size: 88.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for dockai_cli-2.3.0.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | cce87885ca02800800833b701f7f7af48f699d21427b0c14fe6ba11e7afed3dd |
| MD5 | 4221e91b3e15ddaa0c49a90a054bd92a |
| BLAKE2b-256 | 7ba87cc0a068f0bc1af29409fc6087c469359c0d964e88ae930f93d14a25bc36 |


File details

Details for the file dockai_cli-2.3.0-py3-none-any.whl.

File metadata

  • Download URL: dockai_cli-2.3.0-py3-none-any.whl
  • Size: 81.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for dockai_cli-2.3.0-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 16611a845b3e70eb0740e8577218eb7d7397ce4ea603734496f56cab6cad553d |
| MD5 | 85295e048a2805f5d5c244caa8a231a2 |
| BLAKE2b-256 | 38c4652fc9e7dfaa0ec118b653202821dfa0853786205825d1534b5a6d99e730 |

