DevOps Agent
An AI-powered CLI tool that assists with DevOps troubleshooting, Kubernetes application architecture, log analysis, and infrastructure code generation.
Features
- 📊 Log Analysis: Analyze log files and get actionable insights
- 💬 Query Interface: Ask questions about DevOps best practices, Terraform, Kubernetes, etc.
- 🛠️ Template Generation: Generate infrastructure code templates
- 🤖 AI-Powered: Leverages multiple LLM providers (OpenAI, Anthropic, Gemini, Ollama, vLLM)
- 🎯 Flexible Provider Selection: Choose your preferred LLM provider and model dynamically
- 🔒 Self-Hosted Options: Run privately with Ollama or vLLM
- 🧠 Reasoning Mode: Enable advanced reasoning capabilities for complex queries
- 🐛 Debug Mode: Troubleshoot agent behavior with detailed logging
- 💾 Memory Management: Persistent context using Qdrant vector database
- 🎨 Interactive Mode: Engage in continuous conversations with the agent
- 📝 Multiple Output Formats: Export results as text, JSON, or Markdown
Installation
# Clone the repository
git clone https://github.com/yourusername/devops-agent.git
cd devops-agent
# Install in development mode
pip install -e .
# Or install from PyPI (when published)
pip install devops-agent
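After installing, a quick sanity check confirms the console script landed on your PATH (devops-agent is the entry-point name used throughout the usage examples below):

```shell
# Check whether the devops-agent entry point is reachable
if command -v devops-agent >/dev/null 2>&1; then
  echo "devops-agent installed"
else
  echo "devops-agent not found; check your PATH or reinstall"
fi
```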
Configuration
LLM API Keys
# For OpenAI
export OPENAI_API_KEY=YOUR_API_KEY
# For Anthropic Claude
export ANTHROPIC_API_KEY=YOUR_API_KEY
# For Google Gemini
export GEMINI_API_KEY=YOUR_API_KEY
# For Ollama (self-hosted, typically no API key needed)
export OLLAMA_API_KEY=YOUR_API_KEY # Optional
# For vLLM (self-hosted)
export VLLM_API_KEY=YOUR_API_KEY
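Because the agent reads credentials from the environment, a small pre-flight check can fail fast before a run. This is a helper sketch using only the variable names above, not part of the CLI:

```shell
# Report which provider keys are present without printing their values
found=0
for key in OPENAI_API_KEY ANTHROPIC_API_KEY GEMINI_API_KEY OLLAMA_API_KEY VLLM_API_KEY; do
  if [ -n "$(printenv "$key")" ]; then
    echo "$key is set"
    found=1
  fi
done
[ "$found" -eq 1 ] || echo "no LLM API key set (fine for a default Ollama setup)"
```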
Qdrant Config for Agent Memory
(If not configured, the agent falls back to an in-memory vector store.)
export QDRANT_URL=YOUR_QDRANT_URL
export QDRANT_API_KEY=YOUR_QDRANT_API_KEY
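If you want persistent memory but don't have a Qdrant deployment yet, the official Docker image is the quickest way to get one locally (standard Qdrant ports; no API key is required by default):

```shell
# Start a local Qdrant instance (REST on 6333, gRPC on 6334)
docker run -d -p 6333:6333 -p 6334:6334 qdrant/qdrant
export QDRANT_URL=http://localhost:6333
```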
Usage
Ask Questions
devops-agent run --query "I need terraform script to spin up Azure blob storage"
devops-agent run --query "How to increase my pod memory and CPU in k8s"
Interactive Mode
devops-agent run --interactive
# or
devops-agent run -i
Advanced Options
Choose Your LLM Provider and Model
# Use OpenAI with a specific model
devops-agent run --provider openai --model gpt-4o --query "your question"
# Use Anthropic Claude
devops-agent run --provider anthropic --model claude-sonnet-4-20250514 --query "your question"
# Use Google Gemini
devops-agent run --provider google --model gemini-2.0-flash-exp --query "your question"
# Use Ollama (self-hosted)
devops-agent run --provider ollama --model llama3 --query "your question"
# Use vLLM (self-hosted)
devops-agent run --provider vllm --model your-model-name --query "your question"
Enable Debug Mode
devops-agent run --query "your question" --debug_mode true
Enable Reasoning Mode
devops-agent run --query "your question" --reasoning_enabled true
Combine Multiple Options
# Interactive mode with specific provider, model, and reasoning
devops-agent run -i --provider anthropic --model claude-sonnet-4-20250514 --reasoning_enabled true
# Query with debug mode and custom output
devops-agent run --query "docker setup for microservices" --provider openai --model gpt-4o --debug_mode true --output result.md --format markdown
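The --format and --output flags pair naturally; if you script the agent, a small helper can keep the file extension consistent with the chosen format. This is illustrative wrapper logic, not part of the CLI itself:

```shell
# Derive a matching file extension for each supported output format
format=markdown   # one of: text, json, markdown
case "$format" in
  text)     ext=txt ;;
  json)     ext=json ;;
  markdown) ext=md ;;
  *) echo "unknown format: $format" >&2; exit 1 ;;
esac
echo "writing to result.$ext"
```

You could then invoke the agent with --format "$format" --output "result.$ext" so the file name always matches its contents.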
CLI Options Reference
devops-agent run Options
| Option | Type | Description |
|---|---|---|
| --log-file | Path | Path to log file to analyze |
| --provider | String | LLM provider (openai, anthropic, google, ollama, vllm) |
| --model | String | Model name (e.g., gpt-4o, claude-sonnet-4-20250514, gemini-2.0-flash-exp) |
| --query | String | Query to ask the DevOps agent |
| --output | Path | Output file path for saving results |
| --format | Choice | Output format: text, json, or markdown (default: text) |
| --interactive, -i | Flag | Run in interactive mode for continuous conversation |
| --debug_mode | Boolean | Enable debug mode with detailed logging |
| --reasoning_enabled | Boolean | Enable reasoning mode for complex problem-solving |
Provider-Specific Model Examples
OpenAI:
- gpt-4o
- gpt-5-mini
- gpt-5.1
Anthropic:
- claude-sonnet-4-20250514
- claude-sonnet-4-5-20250929
- claude-3-5-sonnet-20241022
Google:
- gemini-3-pro
- gemini-2.5-pro
- gemini-2.5-flash
Ollama (Self-hosted):
- granite4:3b
- qwen3:8b
- cogito:latest
- Any model you have pulled locally
vLLM (Self-hosted):
- Any model served by your vLLM instance
Development
# Install development dependencies
pip install -e ".[dev]"
# Run tests
pytest
# Format code
black devops_agent/
isort devops_agent/
# Lint
flake8 devops_agent/
Project Structure
devops-agent/
├── devops_agent/ # Main package
│ ├── cli.py # CLI interface
│ ├── core/ # Core functionality
│ ├── templates/ # Template generators
│ ├── utils/ # Utilities
│ └── prompts/ # LLM prompts
└── docs/ # Documentation
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
Apache 2.0 License - see LICENSE file for details
Roadmap
- Implement log analysis with pattern detection
- Add support for MCP to use the local file system for quick access
- Add support for Human-in-the-Loop for more focused and collaborative work
- Support for custom prompt templates
- Agent as a Service with a privacy-first design
Support
For issues and questions, please open an issue on GitHub.
Special Credits
- Built with the Agno 2.0 framework for multi-agent orchestration
- Uses POML for structured prompt engineering
- Uses Qdrant for memory management
- Powered by Claude (Anthropic), GPT (OpenAI), and Gemini (Google)