# DockAI

**AI-Powered Dockerfile Generation Framework**

Generate production-ready Dockerfiles from first principles using AI agents.

Quick Start • Features • Docs • CI/CD • Contributing

## 🎯 What is DockAI?
DockAI is an agentic AI framework that analyzes your codebase and generates optimized, production-ready Dockerfiles. Unlike template-based tools, DockAI uses first-principles reasoning to understand your application and create Dockerfiles from scratch—handling everything from standard stacks to legacy systems.
```shell
pip install dockai-cli
dockai build /path/to/project
```
That's it. DockAI handles the rest.
## ✨ Features
- 🧠 **First-Principles AI**: No templates. Analyzes file structures, dependencies, and code patterns to deduce the optimal containerization strategy.
- 🔄 **Self-Correcting Workflow**: Builds and tests Dockerfiles in a sandbox. If something fails, the AI reflects, learns, and retries with a new approach.
- 🛡️ **Security-First**: Built-in Trivy integration scans for vulnerabilities. Enforces non-root users, minimal base images, and hardened configs.
- 🤖 **10 Specialized Agents**: Each agent handles a specific task: analysis, planning, generation, review, and more. All fully customizable.
- ⚡ **Multi-Provider LLMs**: Supports OpenAI, Azure, Gemini, Anthropic, and Ollama. Mix and match providers per agent (e.g., OpenAI for analysis, Ollama for generation).
- 🔧 **Fully Customizable**: Override prompts, instructions, and model selection per agent.
- 📦 **Smart Registry Integration**: Automatically validates base images against Docker Hub, GCR, Quay, and GHCR. Prioritizes small, secure image variants.
- 🚀 **Performance Optimized**: Intelligent caching prevents redundant network calls. Semantic version sorting ensures you always get the latest stable releases.
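Why semantic version sorting matters can be shown with a small sketch (illustrative only, not DockAI's actual code; the tags and `semver_key` helper are hypothetical):

```python
# Illustrative sketch: semantic sorting compares numeric components,
# so "3.11.10" correctly beats "3.9"; plain string sorting gets it wrong.
def semver_key(tag: str) -> tuple:
    return tuple(int(part) for part in tag.lstrip("v").split("."))

tags = ["3.9", "3.10", "3.11.2", "3.11.10"]

print(max(tags))                   # "3.9"     (lexicographic -> wrong)
print(max(tags, key=semver_key))   # "3.11.10" (semantic -> latest stable)
```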
## 🚀 Three Ways to Use DockAI
DockAI is designed to fit into any workflow, whether you are a developer, a DevOps engineer, or an AI user.
### 1. The CLI (For Developers)

Perfect for running locally on your machine.

```shell
# Install
pip install dockai-cli

# Run
dockai build .
```
### 2. GitHub Actions (For CI/CD)

Automate Dockerfile generation in your pipelines.

```yaml
steps:
  - uses: actions/checkout@v3
  - uses: itzzjb/dockai@v2
    with:
      openai_api_key: ${{ secrets.OPENAI_API_KEY }}
```
### 3. MCP Server (For AI Agents)

Use DockAI directly inside Claude Desktop, Cursor, or any MCP-compliant tool.

1. Install `dockai-cli`.
2. Configure your MCP client:

   ```json
   {
     "mcpServers": {
       "dockai": {
         "command": "python",
         "args": ["-m", "dockai.core.mcp_server"]
       }
     }
   }
   ```

3. Ask your AI: "Analyze this project and generate a Dockerfile for it."
## Configuration

Create a `.env` file:

```shell
# Required: choose your LLM provider and add the API key
OPENAI_API_KEY=sk-your-api-key

# Optional: use a different provider (openai, azure, gemini, anthropic, ollama)
# DOCKAI_LLM_PROVIDER=openai
```

## Usage

```shell
# Generate a Dockerfile for your project
dockai build /path/to/project

# With verbose output
dockai build /path/to/project --verbose
```
## 🏗️ How It Works

```mermaid
graph TD
    Start([▶ Start]) --> Scan[📂 Scanner]
    Scan --> Analyze[🧠 Analyzer]
    Analyze --> Read[📖 Reader]
    Read --> Health[🏥 Health Detector]
    Health --> Ready[⏱️ Readiness Detector]
    Ready --> Plan[📝 Planner]
    Plan --> Generate[⚙️ Generator]
    Generate --> Review[🔒 Security Reviewer]
    Review -- Secure --> Validate[✅ Validator]
    Review -- Issues & Can Retry --> Reflect[🤔 Reflector]
    Review -- Critical & Max Retries --> Fail([❌ Fail])
    Validate -- Success --> End([🏁 Finish])
    Validate -- Failure --> Reflect
    Reflect --> Increment[🔄 Increment Retry]
    Increment -- Fix Code --> Generate
    Increment -- New Strategy --> Plan
    Increment -- Re-Analyze --> Analyze
    Increment -- Max Retries --> Fail
```
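The generate → validate → reflect routing above can be sketched as a plain loop (an illustrative simplification, not the actual LangGraph implementation; `validate` and the candidate Dockerfile string are stand-ins):

```python
# Illustrative sketch of the workflow's retry loop (not the real LangGraph code).
MAX_RETRIES = 3  # mirrors the MAX_RETRIES environment variable's default

def run_pipeline(validate) -> str:
    """Generate, validate, and reflect until success or the retry budget runs out."""
    for attempt in range(MAX_RETRIES + 1):
        dockerfile = f"# candidate Dockerfile, attempt {attempt}"
        if validate(dockerfile, attempt):
            return "finish"
        # On failure the Reflector decides whether to fix the code (Generator),
        # pick a new strategy (Planner), or re-analyze the project (Analyzer).
    return "fail"

print(run_pipeline(lambda df, n: n >= 1))  # succeeds on the second attempt -> finish
print(run_pipeline(lambda df, n: False))   # never validates -> fail
```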
## 🤖 The 10 AI Agents
| Agent | Role | Model Type |
|---|---|---|
| Analyzer | Project discovery & stack detection | Fast |
| Planner | Strategic build planning | Fast |
| Generator | Dockerfile creation | Powerful |
| Generator (Iterative) | Debugging failed Dockerfiles | Powerful |
| Reviewer | Security audit & hardening | Fast |
| Reflector | Failure analysis & learning | Powerful |
| Health Detector | Health endpoint discovery | Fast |
| Readiness Detector | Startup pattern analysis | Fast |
| Error Analyzer | Error classification | Fast |
| Iterative Improver | Targeted fix application | Powerful |
## ⚙️ Configuration

### Environment Variables
| Variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key | Required* |
| `GOOGLE_API_KEY` | Google Gemini API key | Required* |
| `ANTHROPIC_API_KEY` | Anthropic Claude API key | Required* |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI API key | Required* |
| `OLLAMA_BASE_URL` | Ollama base URL | `http://localhost:11434` |
| `DOCKAI_LLM_PROVIDER` | Provider (`openai`, `azure`, `gemini`, `anthropic`, `ollama`) | `openai` |
| `MAX_RETRIES` | Maximum retry attempts | `3` |
| `DOCKAI_SKIP_SECURITY_SCAN` | Skip Trivy scanning | `false` |
| `DOCKAI_TRUNCATION_ENABLED` | Enable file truncation | `false` |
| `DOCKAI_TOKEN_LIMIT` | Token limit for auto-truncation | `100000` |
| `DOCKAI_MAX_FILE_CHARS` | Max chars per file (when truncating) | `200000` |
| `DOCKAI_MAX_FILE_LINES` | Max lines per file (when truncating) | `5000` |
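The defaults in the table behave as ordinary environment-variable fallbacks. A sketch of how the lookup logic might work (an assumption about the implementation; only the variable names and defaults come from the table above):

```python
import os

# Illustrative: the documented defaults apply when a variable is unset.
provider = os.environ.get("DOCKAI_LLM_PROVIDER", "openai")
max_retries = int(os.environ.get("MAX_RETRIES", "3"))
skip_scan = os.environ.get("DOCKAI_SKIP_SECURITY_SCAN", "false").lower() == "true"

print(provider, max_retries, skip_scan)  # e.g. "openai 3 False" when nothing is set
```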
\*Only one API key is required: the one for your chosen provider.
### Repository-Level Configuration

Create a `.dockai` file in your project root:

```ini
[instructions_analyzer]
This is a Django application with Celery workers.

[instructions_generator]
Use gunicorn as the WSGI server.
Run database migrations at container start.

[instructions_reviewer]
All containers must run as non-root (UID >= 10000).
```
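The `.dockai` format is simple: `[section]` headers followed by free-text instructions. A hypothetical parser (an assumption about the format as shown above, not DockAI's actual loader) makes the structure concrete:

```python
def parse_dockai(text: str) -> dict:
    """Split a .dockai file into {section_name: instruction_text}."""
    sections: dict = {}
    current = None
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("[") and stripped.endswith("]"):
            current = stripped[1:-1]      # new [section] header
            sections[current] = []
        elif current is not None and stripped:
            sections[current].append(stripped)
    return {name: "\n".join(lines) for name, lines in sections.items()}

example = "[instructions_generator]\nUse gunicorn as the WSGI server."
print(parse_dockai(example))
# {'instructions_generator': 'Use gunicorn as the WSGI server.'}
```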
## 🔗 GitHub Actions

```yaml
name: Auto-Dockerize

on:
  push:
    branches: [main]

jobs:
  dockai:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: itzzjb/dockai@v2
        with:
          openai_api_key: ${{ secrets.OPENAI_API_KEY }}
```
### Multi-Provider Example

```yaml
- uses: itzzjb/dockai@v2
  with:
    llm_provider: gemini
    google_api_key: ${{ secrets.GOOGLE_API_KEY }}
    max_retries: 5
    strict_security: true
```
### Mixed Provider Example

Use Ollama locally for most tasks, but OpenAI for complex analysis:

```shell
# .env
DOCKAI_LLM_PROVIDER=ollama
DOCKAI_MODEL_ANALYZER=openai/gpt-4o-mini
```
See GitHub Actions Guide for all options.
## 📖 Documentation
| Document | Description |
|---|---|
| Getting Started | Installation, configuration, first run |
| Architecture | Deep dive into the internal design |
| Configuration | Full reference for env vars and inputs |
| Customization | Tuning agents for your organization |
| API Reference | Module and function documentation |
| GitHub Actions | CI/CD integration guide |
| MCP Server | AI Agent integration guide |
| FAQ | Frequently asked questions |
💡 MCP Support: Expose DockAI as a Model Context Protocol server for use in any MCP client.
## 🛠️ Tech Stack
| Technology | Purpose |
|---|---|
| Python 3.10+ | Core runtime |
| LangGraph | Stateful agent workflow orchestration |
| LangChain | LLM provider integration |
| Pydantic | Structured output validation |
| Rich + Typer | Beautiful CLI interface |
| Trivy | Security vulnerability scanning |
## 🤝 Contributing
Contributions are welcome! Feel free to open issues and pull requests.
## 📄 License
MIT License - see LICENSE for details.
Built with ❤️ by itzzjb