
AI-Powered Docker Security Analyzer

Project description


DockSec

AI-powered Docker security scanner that explains vulnerabilities in plain English



🏆 Officially recognized as an OWASP Incubator Project

Trusted by the global security community • 14,000+ downloads


What is DockSec?

DockSec is an OWASP Incubator Project that combines traditional Docker security scanners (Trivy, Hadolint, Docker Scout) with AI to provide context-aware security analysis. Instead of dumping 200 CVEs and leaving you to figure it out, DockSec:

  • Prioritizes what actually matters
  • Explains vulnerabilities in plain English
  • Suggests specific fixes for YOUR Dockerfile
  • Generates professional security reports

Think of it as having a security expert review your Dockerfiles.

Why OWASP Recognition Matters

Being recognized as an OWASP Incubator Project means:

  • Peer-reviewed by security professionals
  • Community-driven development and governance
  • Trusted by enterprises and security teams worldwide
  • Open source with transparent security practices
  • Active maintenance and regular updates

Join thousands of developers using DockSec to secure their containers.

How It Works

DockSec workflow: from scanning to actionable insights

DockSec follows a simple pipeline:

  1. Scan - Runs Trivy, Hadolint, and Docker Scout on your images/Dockerfiles
  2. Analyze - AI processes all findings and correlates them with your setup
  3. Recommend - Get plain English explanations with specific line-by-line fixes
  4. Report - Export results in JSON, PDF, HTML, or Markdown formats

Quick Start

# Install
pip install docksec

# Scan your Dockerfile
docksec Dockerfile

# Scan with image analysis
docksec Dockerfile -i myapp:latest

# Scan without AI (no API key needed)
docksec Dockerfile --scan-only

Features

  • Smart Analysis: AI explains what vulnerabilities mean for your specific setup
  • Multiple LLM Providers: Support for OpenAI, Anthropic Claude, Google Gemini, and Ollama (local models)
  • Multiple Scanners: Integrates Trivy, Hadolint, and Docker Scout
  • Security Scoring: Get a 0-100 score to track improvements
  • Multiple Formats: Export reports as HTML, PDF, JSON, or CSV
  • No AI Required: Works offline with --scan-only mode
  • CI/CD Ready: Easy integration into build pipelines
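The CI/CD integration can be sketched as a minimal GitHub Actions workflow. The job layout and file path below are assumptions for illustration; the install command and the --scan-only flag come from this page:

```yaml
# .github/workflows/docksec.yml (hypothetical path)
name: docksec-scan
on: [push, pull_request]

jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"   # DockSec requires Python 3.12+
      - run: pip install docksec
      # --scan-only needs no LLM API key, so no CI secrets are required
      - run: docksec Dockerfile --scan-only
```

With an API key stored as a repository secret, the last step could instead run the full AI analysis.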

Installation

Requirements: Python 3.12+, Docker (for image scanning)

pip install docksec

For AI features, choose your preferred LLM provider:

OpenAI (Default)

export OPENAI_API_KEY="your-key-here"

Anthropic Claude

export ANTHROPIC_API_KEY="your-key-here"
export LLM_PROVIDER="anthropic"
export LLM_MODEL="claude-3-5-sonnet-20241022"

Google Gemini

export GOOGLE_API_KEY="your-key-here"
export LLM_PROVIDER="google"
export LLM_MODEL="gemini-1.5-pro"

Ollama (Local Models)

# First, install and run Ollama: https://ollama.ai
# Then pull a model: ollama pull llama3.1
export LLM_PROVIDER="ollama"
export LLM_MODEL="llama3.1"
# Optional: customize Ollama URL
export OLLAMA_BASE_URL="http://localhost:11434"
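These settings can also be persisted in a .env file, which DockSec reads for configuration (see the Configuration section below). A quick sketch using the values above:

```shell
# Write the Ollama settings into a .env file in the project directory
cat > .env <<'EOF'
LLM_PROVIDER=ollama
LLM_MODEL=llama3.1
OLLAMA_BASE_URL=http://localhost:11434
EOF
```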

External tools (optional, for full scanning):

# Install Trivy and Hadolint
python -m docksec.setup_external_tools

# Or install manually:
# - Trivy: https://aquasecurity.github.io/trivy/
# - Hadolint: https://github.com/hadolint/hadolint

Usage

Basic Scanning

# Analyze Dockerfile with AI recommendations
docksec Dockerfile

# Scan Dockerfile + Docker image
docksec Dockerfile -i nginx:latest

# Get only scan results (no AI)
docksec Dockerfile --scan-only

# Scan image without Dockerfile
docksec --image-only -i nginx:latest

# Use specific LLM provider and model
docksec Dockerfile --provider anthropic --model claude-3-5-sonnet-20241022

# Use local Ollama model
docksec Dockerfile --provider ollama --model llama3.1

CLI Options

Option         Description
dockerfile     Path to Dockerfile
-i, --image    Docker image to scan
-o, --output   Output file path
--provider     LLM provider (openai, anthropic, google, ollama)
--model        Model name (e.g., gpt-4o, claude-3-5-sonnet-20241022)
--ai-only      AI analysis only (no scanning)
--scan-only    Scanning only (no AI)
--image-only   Scan image without Dockerfile

Configuration

Create a .env file for advanced configuration:

# LLM Provider Configuration
LLM_PROVIDER=openai                    # Options: openai, anthropic, google, ollama
LLM_MODEL=gpt-4o                       # Model to use
LLM_TEMPERATURE=0.0                    # Temperature (0-1)

# API Keys
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_API_KEY=your-google-key

# Ollama Configuration (for local models)
OLLAMA_BASE_URL=http://localhost:11434

# Scanning Configuration
TRIVY_SCAN_TIMEOUT=600
DOCKSEC_DEFAULT_SEVERITY=CRITICAL,HIGH

See the project documentation for the full list of configuration options.

Example Output

🔍 Scanning Dockerfile...
⚠️  Security Score: 45/100

Critical Issues (3):
  • Running as root user (line 12)
  • Hardcoded API key detected (line 23)
  • Using vulnerable base image

💡 AI Recommendations:
  1. Add non-root user: RUN useradd -m appuser && USER appuser
  2. Move secrets to environment variables or build secrets
  3. Update FROM ubuntu:20.04 to ubuntu:22.04 (fixes 12 CVEs)

📊 Full report: results/nginx_latest_report.html
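The three recommendations map onto concrete Dockerfile edits. A minimal sketch, assuming the Ubuntu-based image from the example (the app layout and names are illustrative):

```dockerfile
# Recommendation 3: move off the vulnerable base image
FROM ubuntu:22.04

WORKDIR /app
COPY . .

# Recommendation 2: no hardcoded API key; supply it at runtime instead,
# e.g. docker run -e API_KEY=... myapp:latest, or use build secrets

# Recommendation 1: create and switch to a non-root user
RUN useradd -m appuser
USER appuser
```

Re-running docksec after changes like these should raise the 0-100 security score.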

Architecture

Dockerfile → [Trivy + Hadolint + Scout] → AI Analysis → Reports

DockSec runs security scanners locally, then uses AI to:

  1. Combine and deduplicate findings
  2. Assess real-world impact for your context
  3. Generate actionable remediation steps
  4. Calculate security score

Supported AI Providers:

  • OpenAI: GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo
  • Anthropic: Claude 3.5 Sonnet, Claude 3 Opus
  • Google: Gemini 1.5 Pro, Gemini 1.5 Flash
  • Ollama: Llama 3.1, Mistral, Phi-3, and other local models

All scanning happens on your machine. Only scan results (not your code) are sent to the AI provider when using AI features.

Roadmap

  • Multiple LLM provider support (OpenAI, Anthropic, Google, Ollama)
  • Docker Compose support
  • Kubernetes manifest scanning
  • GitHub Actions integration
  • Custom security policies

See open issues or suggest features in discussions.

Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

Quick links:

  • Documentation

Troubleshooting

"No OpenAI API Key provided"
→ Set the API key for your provider (OPENAI_API_KEY, ANTHROPIC_API_KEY, or GOOGLE_API_KEY), or use --scan-only mode

"Unsupported LLM provider"
→ Valid providers: openai, anthropic, google, ollama. Set with --provider flag or LLM_PROVIDER env var

"Hadolint not found"
→ Run python -m docksec.setup_external_tools

"Python version not supported"
→ DockSec requires Python 3.12+. Use pyenv install 3.12 to upgrade.

"Connection refused" with Ollama
→ Make sure Ollama is running (ollama serve) and the model is pulled (ollama pull llama3.1)

"Where are my scan results?"
→ Results are saved to the results/ directory in your DockSec installation
→ Customize the location: export DOCKSEC_RESULTS_DIR=/custom/path
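For example, to keep reports next to the project being scanned (the directory value here is illustrative):

```shell
# Redirect DockSec's report output into the current project
export DOCKSEC_RESULTS_DIR="$PWD/results"
mkdir -p "$DOCKSEC_RESULTS_DIR"
```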

For more issues, see Troubleshooting Guide.

License

MIT License - see LICENSE for details.

Recognition & Community

DockSec is proud to be an OWASP Incubator Project, recognized by the Open Worldwide Application Security Project for its contribution to application security.

What This Means

  • Vetted by Security Experts: OWASP projects undergo rigorous review
  • Community Trust: Join thousands of security professionals using OWASP tools
  • Enterprise Ready: OWASP recognition provides confidence for enterprise adoption
  • Long-term Sustainability: Backed by a global nonprofit foundation

If DockSec helps you, give it a ⭐ to help others discover it!

Built with ❤️ by Advait Patel

Project details


Download files

Download the file for your platform.

Source Distribution

docksec-2026.4.2.tar.gz (47.8 kB)

Uploaded Source

Built Distribution


docksec-2026.4.2-py3-none-any.whl (44.1 kB)

Uploaded Python 3

File details

Details for the file docksec-2026.4.2.tar.gz.

File metadata

  • Download URL: docksec-2026.4.2.tar.gz
  • Upload date:
  • Size: 47.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.13

File hashes

Hashes for docksec-2026.4.2.tar.gz
Algorithm     Hash digest
SHA256        d1ff210556ce3f120164a6792fe8ce1c250d8e17180320441b11127a38ddff8f
MD5           480b037e4daab26fcb7266d9fca43ff5
BLAKE2b-256   39e88701927eb91cb9b74941189eb39570af422fb97fd7a12582476792b1feab


File details

Details for the file docksec-2026.4.2-py3-none-any.whl.

File metadata

  • Download URL: docksec-2026.4.2-py3-none-any.whl
  • Upload date:
  • Size: 44.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.13

File hashes

Hashes for docksec-2026.4.2-py3-none-any.whl
Algorithm     Hash digest
SHA256        4bad4de47146e359e3691c2b28a7fd6d82230da72c487ec2f4bba90d7be95f22
MD5           7d12d969f2545df2138fe95286c0b512
BLAKE2b-256   bc6d74f00356a7e27a6c1c39b82ab49788b1791bbc6ed946e003ce3be5d825be

