
Community Edition CLI agent for building RAG pipelines

Project description

RagOps Agent CE (Community Edition)

Python 3.12+ · MIT License

An LLM-powered CLI agent that automates the creation and maintenance of Retrieval-Augmented Generation (RAG) pipelines. The agent orchestrates built-in tools and Model Context Protocol (MCP) servers to plan, chunk, and load documents into vector stores.

Built by Donkit AI - Open Source RAG Infrastructure.

Key Features

  • Interactive REPL — Start an interactive session with readline history and autocompletion
  • Checklist-driven workflow — The agent creates project checklists, asks for approval before each step, and tracks progress
  • Multi-language support — Automatically detects and responds in the user's language
  • Session-scoped checklists — Only current session checklists appear in the UI
  • Integrated MCP servers — Built-in support for planning, chunking, and vector loading
  • Docker Compose orchestration — Automated deployment of RAG infrastructure (Qdrant, RAG service)
  • Multiple LLM providers — Supports Vertex AI, OpenAI, Azure OpenAI, Anthropic Claude, Ollama

Installation

From PyPI

pip install donkit-ragops-ce

Quick Start

  1. Configure your LLM provider:
# Choose provider: 'vertexai', 'openai', 'anthropic', 'ollama'
export RAGOPS_LLM_PROVIDER=vertexai

# Vertex AI (Google Cloud)
export RAGOPS_VERTEX_CREDENTIALS=/path/to/service_account.json

# Or OpenAI
export RAGOPS_OPENAI_API_KEY=sk-...

# Or Anthropic
export RAGOPS_ANTHROPIC_API_KEY=sk-ant-...

# Or Ollama (local)
export RAGOPS_OLLAMA_BASE_URL=http://localhost:11434
  2. Start the agent:
donkit-ragops-ce
  3. Tell the agent what you want:
you> Create a RAG pipeline for my documentation in /path/to/docs

The agent will automatically:

  • Create a project structure
  • Generate a configuration plan
  • Chunk your documents
  • Set up Docker Compose with Qdrant and RAG service
  • Load data into the vector store

Usage

Note: The command ragops-agent is also available as an alias for backward compatibility.

The agent starts in interactive REPL mode by default. Use subcommands like ping for specific actions.

Interactive Mode (REPL)

# Start interactive session
donkit-ragops-ce

# With specific provider
donkit-ragops-ce -p vertexai

# With custom model
donkit-ragops-ce -p openai -m gpt-4

Command-line Options

  • -p, --provider — Override LLM provider from settings
  • -m, --model — Specify model name
  • -s, --system — Custom system prompt
  • --show-checklist/--no-checklist — Toggle checklist panel (default: shown)
  • --mcp-command — Add custom MCP server (can be used multiple times)
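Because --mcp-command can be repeated, several extra MCP servers can be attached in one session. A hypothetical invocation (my-custom-mcp-server is an invented name), assembled into a variable and echoed so the exact flags are visible:

```shell
# Attach two MCP servers in a single session; server names are illustrative.
cmd='donkit-ragops-ce -p openai -m gpt-4 --mcp-command "ragops-chunker" --mcp-command "my-custom-mcp-server"'
echo "$cmd"
```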

Subcommands

# Health check
donkit-ragops-ce ping

Environment Variables

  • RAGOPS_LLM_PROVIDER — LLM provider name
  • RAGOPS_LOG_LEVEL — Logging level (default: INFO)
  • RAGOPS_MCP_COMMANDS — Comma-separated list of MCP commands
  • RAGOPS_VERTEX_CREDENTIALS — Path to Vertex AI service account JSON
  • RAGOPS_OPENAI_API_KEY — OpenAI API key
  • RAGOPS_ANTHROPIC_API_KEY — Anthropic API key
  • RAGOPS_OLLAMA_BASE_URL — Ollama server URL
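Taken together, a session might be configured with a snippet like the following. The provider choice, placeholder key, and server list are examples, not defaults:

```shell
# Illustrative session setup using the variables documented above.
export RAGOPS_LLM_PROVIDER=openai
export RAGOPS_OPENAI_API_KEY=sk-example        # placeholder, not a real key
export RAGOPS_LOG_LEVEL=DEBUG                  # default is INFO
export RAGOPS_MCP_COMMANDS="ragops-chunker,ragops-vectorstore-loader"
env | grep '^RAGOPS_' | sort
```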

Agent Workflow

The agent follows a structured workflow:

  1. Language Detection — Detects the user's language from the first message
  2. Project Creation — Creates the project directory structure
  3. Checklist Creation — Generates a task checklist in the user's language
  4. Step-by-Step Execution:
    • Asks for permission before each step
    • Marks item as in_progress
    • Executes the task using appropriate MCP tool
    • Reports results
    • Marks item as completed
  5. Deployment — Sets up Docker Compose infrastructure
  6. Data Loading — Loads documents into vector store
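The status transitions in step 4 can be sketched with a throwaway file standing in for the checklist. The file format here is invented for illustration; the real checklist lives behind the ragops-checklist MCP tools:

```shell
# Illustrative only: mimic one item's status transitions during step 4.
checklist=$(mktemp)
echo "chunk_documents=pending" > "$checklist"

# User approves the step; the agent marks it in_progress.
sed -i.bak 's/=pending/=in_progress/' "$checklist"

# The MCP tool runs, results are reported, and the item is completed.
sed -i.bak 's/=in_progress/=completed/' "$checklist"

cat "$checklist"
```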

MCP Servers

RagOps Agent CE includes built-in MCP servers:

ragops-rag-planner

Plans RAG pipeline configuration based on requirements.

# Example usage
donkit-ragops-ce --mcp-command "ragops-rag-planner"

Tools:

  • plan_rag_config — Generate RAG configuration from requirements

ragops-chunker

Chunks documents for vector storage.

# Example usage
donkit-ragops-ce --mcp-command "ragops-chunker"

Tools:

  • chunk_documents — Split documents into chunks with configurable strategies
  • list_chunked_files — List processed chunk files

ragops-vectorstore-loader

Loads chunks into vector databases.

# Example usage
donkit-ragops-ce --mcp-command "ragops-vectorstore-loader"

Tools:

  • vectorstore_load — Load documents into Qdrant, Chroma, or Milvus
  • delete_from_vectorstore — Remove documents from vector store

ragops-compose-manager

Manages Docker Compose infrastructure.

# Example usage
donkit-ragops-ce --mcp-command "ragops-compose-manager"

Tools:

  • init_project_compose — Initialize Docker Compose for project
  • compose_up — Start services
  • compose_down — Stop services
  • compose_status — Check service status
  • compose_logs — View service logs

ragops-checklist

Manages project checklists and progress tracking.

Tools:

  • create_checklist — Create new checklist
  • get_checklist — Get current checklist
  • update_checklist_item — Update item status

Examples

Basic RAG Pipeline

donkit-ragops-ce
you> Create a RAG pipeline for customer support docs in ./docs folder

The agent will:

  1. Create project structure
  2. Plan RAG configuration
  3. Chunk documents from ./docs
  4. Set up Qdrant + RAG service
  5. Load data into vector store

Custom Configuration

donkit-ragops-ce -p vertexai -m gemini-1.5-pro
you> Build RAG for legal documents with 1000 token chunks and reranking

Multiple Projects

Each project gets its own:

  • Project directory (projects/<project_id>)
  • Docker Compose setup
  • Vector store collection
  • Configuration

Development

Project Structure

donkit-ragops-ce/
├── src/ragops_agent_ce/
│   ├── agent/          # LLM agent core
│   ├── llm/            # LLM provider integrations
│   ├── mcp/            # MCP servers and client
│   │   └── servers/    # Built-in MCP servers
│   ├── cli.py          # CLI commands
│   └── config.py       # Configuration
├── tests/
└── pyproject.toml

Running Tests

poetry run pytest

Code Quality

# Format code
poetry run ruff format .

# Lint code
poetry run ruff check .

Docker Compose Services

The agent can deploy these services:

Qdrant (Vector Database)

services:
  qdrant:
    image: qdrant/qdrant:latest
    ports:
      - "6333:6333"
      - "6334:6334"

RAG Service

services:
  rag-service:
    image: donkit/rag-service:latest
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URI=http://qdrant:6333
      - CONFIG=<base64-encoded-config>
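The CONFIG value in the snippet above is base64-encoded. Assuming a JSON config file (the file name and fields below are made up, not the service's real schema), it could be produced like this:

```shell
# Encode a RAG service config for the CONFIG environment variable.
# config.json and its contents are illustrative.
printf '{"collection": "docs", "top_k": 5}' > config.json
CONFIG=$(base64 < config.json | tr -d '\n')   # strip newlines for a single-line value
echo "$CONFIG"
```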

Architecture

┌─────────────────┐
│  RagOps Agent   │
│     (CLI)       │
└────────┬────────┘
         │
         ├── MCP Servers ───────────────┐
         │   ├── ragops-rag-planner     │
         │   ├── ragops-chunker         │
         │   ├── ragops-vectorstore     │
         │   └── ragops-compose         │
         │                              │
         └── LLM Providers ─────────────┤
             ├── Vertex AI              │
             ├── OpenAI                 │
             ├── Anthropic              │
             └── Ollama                 │
                                        │
                                        ▼
                            ┌──────────────────┐
                            │ Docker Compose   │
                            ├──────────────────┤
                            │ • Qdrant         │
                            │ • RAG Service    │
                            └──────────────────┘

Troubleshooting

MCP Server Connection Issues

If MCP servers fail to start:

# Check MCP server logs
RAGOPS_LOG_LEVEL=DEBUG donkit-ragops-ce

Vector Store Connection

Ensure Docker services are running:

cd projects/<project_id>
docker-compose ps
docker-compose logs qdrant

Credentials Issues

Verify your credentials:

# Vertex AI
gcloud auth application-default print-access-token

# OpenAI
echo $RAGOPS_OPENAI_API_KEY

Contributing

We welcome contributions! Please see CONTRIBUTING.md for details.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

Related Projects


Built with ❤️ by Donkit AI

Download files

Download the file for your platform.

Source Distribution

donkit_ragops_ce-0.1.2.tar.gz (45.9 kB)

Uploaded Source

Built Distribution


donkit_ragops_ce-0.1.2-py3-none-any.whl (58.9 kB)

Uploaded Python 3

File details

Details for the file donkit_ragops_ce-0.1.2.tar.gz.

File metadata

  • Download URL: donkit_ragops_ce-0.1.2.tar.gz
  • Upload date:
  • Size: 45.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.4 CPython/3.12.12 Linux/6.14.0-1014-gcp

File hashes

Hashes for donkit_ragops_ce-0.1.2.tar.gz:

  • SHA256 — 86f7fb84a9925c804f46d2ff0f86f38118e45965dce064b78f2d8d2026486956
  • MD5 — 336718f7937e0be9f2983053e9074fbd
  • BLAKE2b-256 — 5791943cc2c32eecf285245dc22e0d3162d62db81ccf43b3309662694c3a96b2


File details

Details for the file donkit_ragops_ce-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: donkit_ragops_ce-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 58.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.4 CPython/3.12.12 Linux/6.14.0-1014-gcp

File hashes

Hashes for donkit_ragops_ce-0.1.2-py3-none-any.whl:

  • SHA256 — 21c12c3b28c927fb7ca925c837b12abb462ceaefa8ca26b07b6befdb2621844d
  • MD5 — 83f8edd65a878212160d2b406f9ae5b1
  • BLAKE2b-256 — fc3f01e455d4ab3392ce5cb842a244f5900bb70a3cb7d22210a997cb878e92b4

