
DevOps Agent

An AI-powered CLI tool that assists with DevOps troubleshooting, Kubernetes application architecture, log analysis, and infrastructure code generation.

Features

  • 📊 Log Analysis: Analyze log files and get actionable insights
  • 💬 Query Interface: Ask questions about DevOps best practices, Terraform, Kubernetes, etc.
  • 🛠️ Template Generation: Generate infrastructure code templates
  • 🤖 AI-Powered: Leverages multiple LLM providers (OpenAI, Anthropic, Gemini, Ollama, vLLM)
  • 🎯 Flexible Provider Selection: Choose your preferred LLM provider and model dynamically
  • 🔒 Self-Hosted Options: Run privately with Ollama or vLLM
  • 🧠 Reasoning Mode: Enable advanced reasoning capabilities for complex queries
  • 🐛 Debug Mode: Troubleshoot agent behavior with detailed logging
  • 💾 Memory Management: Persistent context using Qdrant vector database
  • 🎨 Interactive Mode: Engage in continuous conversations with the agent
  • 📝 Multiple Output Formats: Export results as text, JSON, or Markdown

Installation

# Clone the repository
git clone https://github.com/yourusername/devops-agent.git
cd devops-agent

# Install in development mode
pip install -e .

# Or install from PyPI (when published)
pip install devops-agent
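To confirm the install succeeded, a small standard-library check (not part of the package, the distribution name `devops-agent` is assumed from the PyPI listing) can look up the installed version:

```python
from importlib import metadata

def installed_version(dist="devops-agent"):
    """Return the installed version of a distribution, or None if it is absent."""
    try:
        return metadata.version(dist)
    except metadata.PackageNotFoundError:
        return None
```

If this returns `None`, the package is not on the current interpreter's path.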

Configuration

LLM API Keys

# For OpenAI
export OPENAI_API_KEY=YOUR_API_KEY

# For Anthropic Claude
export ANTHROPIC_API_KEY=YOUR_API_KEY

# For Google Gemini
export GEMINI_API_KEY=YOUR_API_KEY

# For Ollama (self-hosted, typically no API key needed)
export OLLAMA_API_KEY=YOUR_API_KEY  # Optional

# For vLLM (self-hosted)
export VLLM_API_KEY=YOUR_API_KEY
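As a quick sanity check before running the agent, a minimal Python sketch (a hypothetical helper, not part of the package; the variable names follow the exports above) that reports which providers have a key set:

```python
import os

# Map each supported provider to the environment variable it reads.
PROVIDER_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "google": "GEMINI_API_KEY",
    "ollama": "OLLAMA_API_KEY",  # optional for self-hosted Ollama
    "vllm": "VLLM_API_KEY",
}

def configured_providers(env=None):
    """Return the providers whose API key variable is set and non-empty."""
    env = os.environ if env is None else env
    return sorted(p for p, var in PROVIDER_KEYS.items() if env.get(var))
```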

Qdrant Config for Agent Memory

(If not configured, the agent falls back to an in-memory vector store.)

export QDRANT_URL=YOUR_QDRANT_URL
export QDRANT_API_KEY=YOUR_QDRANT_API_KEY
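The fallback behavior can be sketched as follows (an illustrative helper, assuming only that the agent keys off `QDRANT_URL` as described above; the actual selection logic lives inside the package):

```python
import os

def memory_backend(env=None):
    """Select Qdrant when QDRANT_URL is set; otherwise fall back to in-memory."""
    env = os.environ if env is None else env
    url = env.get("QDRANT_URL")
    if not url:
        return {"backend": "in-memory"}
    return {"backend": "qdrant", "url": url, "api_key": env.get("QDRANT_API_KEY")}
```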

Usage

Ask Questions

devops-agent run --query "I need terraform script to spin up Azure blob storage"
devops-agent run --query "How to increase my pod memory and CPU in k8s"

Interactive Mode

devops-agent run --interactive
# or
devops-agent run -i

Advanced Options

Choose Your LLM Provider and Model

# Use OpenAI with a specific model
devops-agent run --provider openai --model gpt-4o --query "your question"

# Use Anthropic Claude
devops-agent run --provider anthropic --model claude-sonnet-4-20250514 --query "your question"

# Use Google Gemini
devops-agent run --provider google --model gemini-2.0-flash-exp --query "your question"

# Use Ollama (self-hosted)
devops-agent run --provider ollama --model llama3 --query "your question"

# Use vLLM (self-hosted)
devops-agent run --provider vllm --model your-model-name --query "your question"

Enable Debug Mode

devops-agent run --query "your question" --debug_mode true

Enable Reasoning Mode

devops-agent run --query "your question" --reasoning_enabled true

Combine Multiple Options

# Interactive mode with specific provider, model, and reasoning
devops-agent run -i --provider anthropic --model claude-sonnet-4-20250514 --reasoning_enabled true

# Query with debug mode and custom output
devops-agent run --query "docker setup for microservices" --provider openai --model gpt-4o --debug_mode true --output result.md --format markdown
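The three output formats behave roughly as sketched below (a hypothetical helper for illustration; the package's own renderer may differ in detail):

```python
import json

def render_result(answer, fmt="text"):
    """Render an agent answer as text, json, or markdown."""
    if fmt == "json":
        return json.dumps({"answer": answer})
    if fmt == "markdown":
        return f"## Answer\n\n{answer}\n"
    return answer  # plain text (default)
```

With `--format markdown --output result.md`, the markdown rendering is what lands in the file.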

CLI Options Reference

devops-agent run Options

| Option | Type | Description |
| --- | --- | --- |
| `--log-file` | Path | Path to a log file to analyze |
| `--provider` | String | LLM provider (`openai`, `anthropic`, `google`, `ollama`, `vllm`) |
| `--model` | String | Model name (e.g., `gpt-4o`, `claude-sonnet-4-20250514`, `gemini-2.0-flash-exp`) |
| `--query` | String | Query to ask the DevOps agent |
| `--output` | Path | Output file path for saving results |
| `--format` | Choice | Output format: `text`, `json`, or `markdown` (default: `text`) |
| `--interactive`, `-i` | Flag | Run in interactive mode for continuous conversation |
| `--debug_mode` | Boolean | Enable debug mode with detailed logging |
| `--reasoning_enabled` | Boolean | Enable reasoning mode for complex problem-solving |
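For readers wiring up a similar CLI, the option table above maps onto a subcommand parser along these lines (a rough `argparse` sketch for illustration only; the real CLI may be built on a different framework such as Click or Typer):

```python
import argparse

def build_parser():
    """Sketch of a `devops-agent run` parser mirroring the options table."""
    str2bool = lambda s: s.lower() in ("true", "1", "yes")
    parser = argparse.ArgumentParser(prog="devops-agent")
    sub = parser.add_subparsers(dest="command", required=True)
    run = sub.add_parser("run", help="Ask the agent a question or start a session")
    run.add_argument("--log-file", help="Path to a log file to analyze")
    run.add_argument("--provider",
                     choices=["openai", "anthropic", "google", "ollama", "vllm"])
    run.add_argument("--model", help="Model name for the chosen provider")
    run.add_argument("--query", help="Question to ask the agent")
    run.add_argument("--output", help="File path for saving results")
    run.add_argument("--format", choices=["text", "json", "markdown"], default="text")
    run.add_argument("--interactive", "-i", action="store_true")
    run.add_argument("--debug_mode", type=str2bool, default=False)
    run.add_argument("--reasoning_enabled", type=str2bool, default=False)
    return parser
```

Note how `--debug_mode true` works: the string value is converted to a boolean rather than using a bare flag, which matches the `--debug_mode true` syntax shown earlier.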

Provider-Specific Model Examples

OpenAI:

  • gpt-4o
  • gpt-5-mini
  • gpt-5.1

Anthropic:

  • claude-sonnet-4-20250514
  • claude-sonnet-4-5-20250929
  • claude-3-5-sonnet-20241022

Google:

  • gemini-3-pro
  • gemini-2.5-pro
  • gemini-2.5-flash

Ollama (Self-hosted):

  • granite4:3b
  • qwen3:8b
  • cogito:latest
  • Any model you have pulled locally

vLLM (Self-hosted):

  • Any model served by your vLLM instance

Development

# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Format code
black devops_agent/
isort devops_agent/

# Lint
flake8 devops_agent/

Project Structure

devops-agent/
├── devops_agent/          # Main package
│   ├── cli.py            # CLI interface
│   ├── core/             # Core functionality
│   ├── templates/        # Template generators
│   ├── utils/            # Utilities
│   └── prompts/          # LLM prompts
└── docs/                 # Documentation

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

Apache 2.0 License - see the LICENSE file for details

Roadmap

  • Implement log analysis with pattern detection
  • Add MCP support so the agent can access the local file system for quick lookups
  • Add human-in-the-loop support for more focused, collaborative work
  • Support custom prompt templates
  • Agent-as-a-Service with a privacy-first design

Support

For issues and questions, please open an issue on GitHub.

Special Credits

  • Built with the Agno 2.0 framework for multi-agent orchestration
  • Uses POML for structured prompt engineering
  • Uses Qdrant for memory management
  • Powered by Claude (Anthropic), GPT (OpenAI), and Gemini (Google)
