
DockAI πŸ³πŸ€–

The Customizable AI Dockerfile Generation Framework


DockAI is an intelligent, adaptive framework that generates production-ready Dockerfiles for any project using Large Language Models (LLMs). It goes beyond simple template generation by understanding your codebase through RAG (Retrieval-Augmented Generation), analyzing your project architecture, and iteratively improving Dockerfiles until they pass all security and validation checks.

🌟 Key Features

🧠 Intelligent Context Understanding

  • RAG-Powered Analysis: Uses semantic embeddings (sentence-transformers) to understand your entire codebase
  • AST Code Intelligence: Extracts entry points, ports, environment variables, and framework dependencies automatically
  • Multi-Language Support: Works with JavaScript/TypeScript, Python, Go, Java, Ruby, PHP, .NET, and more

πŸ—οΈ Multi-Agent Architecture

DockAI v4.0 features a sophisticated multi-agent system orchestrated by LangGraph:

  • Analyzer Agent: Project discovery and technology stack detection
  • Blueprint Agent: Architectural planning and runtime configuration
  • Generator Agent: Dockerfile creation with best practices
  • Iterative Generator Agent: Refining existing Dockerfiles based on feedback
  • Reviewer Agent: Security auditing and vulnerability detection
  • Reflector Agent: Failure analysis and adaptive learning
  • Error Analyzer Agent: Classification of build/runtime errors for better recovery
  • Iterative Improver Agent: Surgical fixes based on validation feedback

πŸ”„ Adaptive & Self-Improving

  • Automatic Validation: Builds and tests the Docker image locally
  • Iterative Refinement: Learns from failures and auto-fixes issues (up to configurable retries)
  • Smart Fallback: Reverts to the last working Dockerfile if fixes fail (ignoring non-critical warnings)
  • Smart Reflection: AI analyzes build/runtime errors and adjusts strategy
  • Reanalysis: Detects when fundamental assumptions are wrong and pivots

πŸ”’ Security & Best Practices

  • Hadolint Integration: Strict Dockerfile linting (warnings are treated as errors and auto-fixed)
  • Trivy Security Scanning: Container vulnerability detection (both tools can also be run standalone, as shown after this list)
  • AI Security Review: Identifies security anti-patterns (root users, exposed secrets, etc.)
  • Multi-Stage Builds: Optimizes for smaller, more secure images
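
If you want to spot-check a generated Dockerfile with the same tools outside of DockAI, both Hadolint and Trivy ship as standalone CLIs. A minimal sketch (the myapp:latest tag is a placeholder):

# Lint the generated Dockerfile with Hadolint
hadolint Dockerfile

# Scan the built image for HIGH/CRITICAL vulnerabilities with Trivy
trivy image --severity HIGH,CRITICAL myapp:latest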

🎯 Production-Ready Features

  • Health Check Detection: Auto-discovers and configures health endpoints (a quick verification snippet follows this list)
  • Resource Optimization: Configurable memory, CPU, and process limits
  • Multi-Platform Support: Works with Docker, Podman, and GitHub Actions
  • Observability: OpenTelemetry and LangSmith tracing support
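
If the generated Dockerfile includes a HEALTHCHECK, you can confirm it on a running container with standard Docker commands. A minimal sketch, assuming the image is tagged myapp:latest:

# Start the container, then inspect the health status Docker reports
docker run -d --name myapp-test myapp:latest
docker inspect --format '{{.State.Health.Status}}' myapp-test   # reports "healthy" once the check passes
docker rm -f myapp-test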

πŸ› οΈ Highly Customizable

  • Multi-LLM Support: OpenAI, Google Gemini, Anthropic Claude, Azure OpenAI, Ollama
  • Per-Agent Model Selection: Choose different models for different tasks (cost vs. quality)
  • Custom Instructions: Override default agent behavior
  • Custom Prompts: Complete control over AI reasoning
  • Environment-Based Configuration: 100+ configuration options via environment variables

πŸ›οΈ Architecture

DockAI v4.0 is built on a modern, agent-based architecture using LangGraph for workflow orchestration:

graph TD
    Start([Start]) --> Scan[Scan Repo]
    Scan --> Analyze[Agent : Analyzer]
    Analyze --> ReadFiles[Read Context]
    ReadFiles --> Blueprint[Agent : Blueprint]
    Blueprint --> Generate[Agent : Generator]
    Generate --> Review[Agent : Reviewer]
    
    Review -->|Secure| Validate[Agent : Validator]
    Review -->|Insecure| CheckRetry{Retry < Max?}
    
    Validate -->|Success| End([Success])
    Validate -->|Fail| CheckRetry
    
    CheckRetry -->|Yes| Reflect[Agent : Reflector]
    CheckRetry -->|No| EndFail([Fail])
    
    Reflect --> IncRetry[Increment Retry]
    IncRetry --> Route{Route Fix}
    
    Route -->|Re-Analyze| Analyze
    Route -->|Re-Plan| Blueprint
    Route -->|Fix Code| Improver[Agent : Iterative Improver]
    Improver --> Review

Core Components

  • LangGraph Workflow Engine: Orchestrates the agent flow with conditional routing
  • RAG Context Engine: In-memory vector store for semantic code search
  • Multi-Agent System: 8 specialized AI agents for different tasks
  • Validation Pipeline: Docker build, Hadolint, Trivy, and health checks (a rough manual equivalent is sketched after this list)
  • State Management: Centralized state for workflow coordination
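
The validation pipeline is roughly equivalent to the manual steps below. This is an illustrative sketch (image name, port, and /health endpoint are placeholders), not the exact commands DockAI runs; the lint and scan steps are shown separately above.

# Build and run the candidate image
docker build -t dockai-candidate .
docker run -d --name dockai-candidate -p 8080:8080 dockai-candidate

# Check the health endpoint and the image size (in bytes)
curl -fsS http://localhost:8080/health
docker image inspect dockai-candidate --format '{{.Size}}'

# Clean up
docker rm -f dockai-candidate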

For detailed architecture documentation, see docs/architecture.md.

🎯 Three Ways to Use DockAI

DockAI can be integrated into your workflow in multiple ways, depending on your needs:

1️⃣ CLI Tool (Local Development)

Install via pip or uv and use directly from the command line:

Using pip:

pip install dockai-cli
export OPENAI_API_KEY="your-key"
dockai build .

Using uv (faster):

# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install DockAI
uv pip install dockai-cli

# Use it
dockai build .

Perfect for:

  • Local development and testing
  • Quick Dockerfile generation
  • Iterating on containerization

2️⃣ GitHub Actions (CI/CD Automation)

Integrate DockAI into your CI/CD pipeline with the GitHub Action:

name: Generate Dockerfile
on: [push]

jobs:
  dockerize:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      
      - name: Generate Dockerfile with DockAI
        uses: itzzjb/dockai@v4
        with:
          openai_api_key: ${{ secrets.OPENAI_API_KEY }}
          project_path: '.'
      
      - name: Commit Dockerfile
        run: |
          git config user.name "DockAI Bot"
          git config user.email "dockai-bot@users.noreply.github.com"  # any valid address works here
          git add Dockerfile
          # Commit and push only if the Dockerfile actually changed
          git diff --cached --quiet || (git commit -m "chore: update Dockerfile" && git push)

Perfect for:

  • Automated Dockerfile updates
  • Multi-service monorepos
  • Continuous integration workflows
  • Team collaboration

Learn more: GitHub Actions Guide

3️⃣ MCP Integration (Model Context Protocol)

Use DockAI as an MCP server with AI assistants like Claude Desktop:

Set up MCP:

{
  "mcpServers": {
    "dockai": {
      "command": "uvx",
      "args": ["dockai-cli"]
    }
  }
}

Usage with Claude Desktop:

You: Can you dockerize this Node.js project?
Claude: [Uses DockAI MCP] I'll generate a Dockerfile for your project...

Perfect for:

  • Interactive AI-assisted development
  • Natural language Dockerfile generation
  • Integration with Claude Desktop, VSCode, and other MCP clients
  • Conversational containerization workflow

Learn more: MCP Integration Guide


πŸš€ Quick Start

Prerequisites

  • Python 3.10 or higher
  • Docker installed and running (a quick check follows this list)
  • An API key from at least one LLM provider (OpenAI, Google, Anthropic, etc.)
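
A quick way to confirm the first two prerequisites (Python version and a running Docker daemon):

python3 --version                                  # should report 3.10 or newer
docker info > /dev/null 2>&1 && echo "Docker is running"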

Installation

Option 1: Install from PyPI (Recommended)

pip install dockai-cli

Option 2: Install with uv (Faster)

uv is an extremely fast Python package installer and resolver:

# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install DockAI
uv pip install dockai-cli

Why use uv?

  • πŸš€ 10-100x faster than pip
  • πŸ“¦ Drop-in replacement for pip
  • πŸ”’ Deterministic dependency resolution

Option 3: Install from Source

git clone https://github.com/itzzjb/dockai.git
cd dockai
pip install -e .

Basic Usage

  1. Set up your API key (choose one provider):
# OpenAI (Default)
export OPENAI_API_KEY="your-api-key-here"

# Or Google Gemini
export GOOGLE_API_KEY="your-api-key-here"
export DOCKAI_LLM_PROVIDER="gemini"

# Or Anthropic Claude
export ANTHROPIC_API_KEY="your-api-key-here"
export DOCKAI_LLM_PROVIDER="anthropic"
  2. Navigate to your project and run DockAI:
cd /path/to/your/project
dockai build .
  3. Done! Your production-ready Dockerfile will be created and validated.

Example Output

πŸ” Scanning project...
βœ“ Found 42 files

🧠 Analyzing project with AI...
βœ“ Detected: Node.js Express application
βœ“ Entry point: src/server.js
βœ“ Dependencies: package.json

πŸ“– Reading files with RAG (10 relevant chunks)...
βœ“ Context retrieved

πŸ—οΈ Creating architectural blueprint...
βœ“ Multi-stage build planned
βœ“ Health endpoint: /health

πŸ”¨ Generating Dockerfile...
βœ“ Dockerfile created

πŸ” Reviewing security...
βœ“ No critical issues found

πŸ§ͺ Validating with Docker...
βœ“ Image built successfully (142 MB)
βœ“ Container started
βœ“ Health check passed

βœ… Dockerfile generated successfully!

πŸ“š Documentation

Comprehensive documentation is available in the docs/ directory.

🎯 Use Cases

Single Project Dockerization

# Automatically detects and handles any project type
cd /path/to/your/project
dockai build .

Polyglot Projects

# Works with multi-language projects
dockai build ./my-fullstack-app

Microservices Architecture

# Generate optimized Dockerfiles for each service
for service in api frontend worker; do
  dockai build ./services/$service
done
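
You can extend the same loop to build and tag an image per service once the Dockerfiles are generated (the services/ layout and tags below are illustrative):

# Generate a Dockerfile for each service, then build and tag its image
for service in api frontend worker; do
  dockai build ./services/$service
  docker build -t "$service:latest" ./services/$service
done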

Custom Requirements

# Add specific requirements for your organization
export DOCKAI_GENERATOR_INSTRUCTIONS="Always use Alpine Linux and pin all versions. Include a maintainer LABEL."
dockai build .

Cost-Optimized Generation

# Use cheaper models for analysis, powerful models for generation
export DOCKAI_MODEL_ANALYZER="gpt-4o-mini"
export DOCKAI_MODEL_GENERATOR="gpt-4o"
dockai build .

βš™οΈ Configuration

DockAI offers extensive configuration through environment variables:

Model Selection

# Use different models for different agents (cost optimization)
export DOCKAI_MODEL_ANALYZER="gpt-4o-mini"      # Fast, cheap model for analysis
export DOCKAI_MODEL_GENERATOR="gpt-4o"          # Powerful model for generation
export DOCKAI_MODEL_REFLECTOR="gemini-1.5-pro"  # Strong reasoning for failure analysis

Security & Validation

export DOCKAI_SKIP_HADOLINT="false"            # Enable Dockerfile linting
export DOCKAI_SKIP_SECURITY_SCAN="false"       # Enable Trivy scanning
export DOCKAI_STRICT_SECURITY="true"           # Fail on any vulnerability
export DOCKAI_MAX_IMAGE_SIZE_MB="500"          # Max acceptable image size

RAG Configuration

export DOCKAI_USE_RAG="true"                   # Enable RAG (default in v4.0)
export DOCKAI_EMBEDDING_MODEL="all-MiniLM-L6-v2"  # Embedding model
export DOCKAI_READ_ALL_FILES="true"            # Read all source files

Retry & Adaptation

export MAX_RETRIES="3"                         # Max attempts to fix failures

For complete configuration options, see docs/configuration.md.

πŸ§ͺ Testing

Run the test suite:

# Install test dependencies
pip install -e ".[test]"

# Run all tests
pytest

# Run with coverage
pytest --cov=src/dockai --cov-report=html
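
To iterate on a subset of tests or browse the HTML coverage report (pytest-cov writes it to htmlcov/ by default), the usual pytest options apply; the -k expression below is just an example:

# Run only tests whose names match an expression
pytest -k "analyzer" -v

# Open the HTML coverage report generated above
open htmlcov/index.html   # macOS; use xdg-open on Linux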

🀝 Contributing

Contributions are welcome! Please see our Contributing Guide for details.

πŸ—ΊοΈ Roadmap

  • Support for Docker Compose generation
  • .dockerignore file generation
  • Multi-stage build optimization advisor
  • Integration with container registries
  • Web UI for interactive generation
  • Plugin system for custom validators

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • LangGraph for the agent orchestration framework
  • LangChain for LLM abstractions and tools
  • Sentence Transformers for efficient embeddings
  • All the open-source projects that make DockAI possible

πŸ“§ Support


Made with ❀️ by Januda Bethmin

⭐ If you find DockAI useful, please give it a star on GitHub!
